I think Europe is better because it has an actual culture that isn't just a sellout, cheap imitation of other cultures, it knows how to mind its own damn business, and it isn't run by hypocritical Fascist warmongers.
Ah, and not to mention that European cities are a billion times better for just strolling around than American cities.
-
The Brits and the French have always been saved by America. There are only a couple of countries in Europe I like better.