Yeah, the problem is defining "West". Everybody seems to have a different opinion of what that means. You often hear people say it's about European vs Asian values, but what about Africa's influence in both of those places? Or the indigenous populations of the Americas and Oceania, where do they fall?
Also, a lot of people who champion "Western Civilization" conveniently omit South America, which has as much European ancestry as North America. The whole thing is absurd.
RE: Is the West better than the East?