Is the act of bringing democracy and the Western way of life into developing countries a legitimate one that Western countries should pursue? Both today and in the past, we have seen that forcing Western ideas onto other cultures has ended in failure. The first question that comes to my mind, then, is: what does the West actually want? An unstable world? That would certainly benefit the wealthy elites (for instance, through the international arms trade). Someone needs to keep adding to their wealth, right? After all, conflict means money. It has always suited the West to keep other countries underdeveloped; they remain a resource it can exploit through various means. The most basic step Western countries take is changing the local government, which is promoted to the world as progress towards a democratic regime. However, these men, not so surprisingly, end up as selfish puppet leaders.
Nonetheless, perhaps we need to adopt a policy of toleration rather than impose a global justice, the kind of justice as understood by the citizens of the West. The key idea could be built around "Live and Let Live".