So many of you seem to hate the US, and a lot of you talk about how much worse the US is compared to other countries.
So I'm curious: which countries do you think are better than America, and why?
-
I'm from the United States, so I will always love it and consider it my "home country," but I have a particular liking for the Gulf Arab states, mainly Qatar. I don't necessarily think it's better, though.