I just find it very obnoxious when I hear Americans, especially celebrities and politicians, say that the US is the greatest country in the world and whip up a frenzy.
To those of you who believe it is, I ask you: how do you know that it is?
Have you actually traveled to, and lived in, every single country in the world?
If the answer is no, how can you possibly say it's the greatest country?
To paraphrase one of my favorite stand-up comedians, Lewis Black: saying that is like having a work colleague who constantly tells you they're the greatest employee. At some point you'd want to slap that person silly.
Americans who say that are just like that work colleague, and it is very annoying and obnoxious.
Maybe, just maybe, other countries have things and do things that you as a country might want to take on board, such as free healthcare, four-day work weeks, or not spending billions on the military.
Just a bit of food for thought.