Your take
Ok, I give up!
You tell me, is the West morally superior? Why or why not?
By the West, I have been told, it doesn't mean the "white man" but the West - the Western civilization. The civilization that is, broadly speaking, Judeo-Christian, if you are more inclined to trace your roots religiously. Or the civilization that traces itself from the Greeks and the Romans (and which adopted some kind of Judeo-Christian ideas and westernized them. Ouch! I am coloring your thinking with my bias already!)
Ok, define your own Western civilization! Let's call it the "broadly speaking Western civilization" known in what is now the USA/Canada, Great Britain, and/or Europe (though if you are a true-blue American, you very likely want to divorce yourself from those pesky, snobbish folks across the Atlantic, but that is another story; if you are an adherent of that story, just remove them from your definition of "western"), Australia/NZ, and perhaps the erstwhile "white" South Africa. (Some might even want to include the Anglophilic Indian and Singaporean/Malaysian/Hong Kong cultures, but that might be just a tad too broad for most people, and probably only accepted by the most liberal, again "broadly" speaking, of bona fide Westerners.)
But you get my drift.
So, why don't we do this?
Please tell me:
(1) Is the West morally superior?
(2) Why?
(3) And, if you are still with me, can you please tell me who you would include in the "West"?