The United States of America is without a doubt the most influential nation on the planet, and fair or not, that influence has led to numerous things the world can’t stand about Americans. Many of these things could be classified as stereotypes of how Americans act and think, and whether true or not, those perceptions are now unshakable.
As you’ll see on the list, one of those things is the general ignorance of and disregard for the rest of the world that Americans are seen as having, a topic I previously touched on in a piece entitled 9 out of 10 Americans Are Completely Wrong About These Mind Blowing Facts. Of the items that made that list, the most mind-blowing is without a doubt their ignorance of the scientific consensus on evolution.
There is an alarming level of misinformation and pseudo-science being taught in the U.S. by groups with their own agendas, and the result is a populace that is woefully ignorant and misinformed on some subjects compared to other first-world countries.
What other things made the list of things the world can’t stand about Americans? Read on to find out.
10. Calling Football “Soccer”
Seem trivial? Well, for many fans of the “beautiful game”, the audacity of Americans to call their favored sport by another name, while reserving “football” for one of their own completely unrelated sports, is enraging. As they’ll point out, the sport Americans do call football is hardly even played with the feet (apart from kickoffs, punts, and field goal attempts, that is).
The American affronts only get worse from here, as we count down the 10 things the world can’t stand about Americans.
9. American Spellings
Did Americans really have to change the spellings of perfectly fine English words and create a division between American and British English? According to the many people who are annoyed by these alterations to the language, no; no they didn’t.
8. Their Perception of Others
Yes, there is a perception that Americans’ view of others is unfair and stereotypical. As ironic as it seems, the prevalence of American movies and culture makes it a fair enough point, though a murky one nonetheless. After all, any viewpoints expressed in Hollywood films are considered “American”, even though a film’s writer, director, and/or stars may not actually be American at all.
7. Indifference to Anything Outside Their Country
As mentioned in the opening, Americans tend to be rather ignorant of the world outside their country, but more than that, they just don’t really care about it. They tend to have a very us-against-the-world mentality and see themselves as separate from, or even above, what happens elsewhere, rather than as part of a connected global community.
6. Thinking Everyone Wishes They Were American
Americans tend to think very highly of themselves and their country, and believe that, given the chance, anyone would wish to have been born American and/or to live in America. Naturally, that is extremely insulting to most people, many of whom want absolutely nothing to do with the country and its culture, and find such ideas absurd and delusional.
5. Belief That They’re the Only Country with True Freedoms
Many Americans truly seem to think theirs is one of the only countries in the world with real freedom. In reality, not only do dozens of other countries enjoy the same freedoms Americans boast about, but those freedoms are not being encroached upon by their governments the way America’s are. The U.S. has tumbled to 46th in press freedom rankings, drones fill the skies, and local and federal governments continue to overstep their bounds in thousands of cases around the country, raising cries that America is going down the road of becoming a police state.
4. Exportation of Their Trashy Culture
American culture has permeated the globe like that of no other, thanks to the country’s economic clout, which allowed it to build a massive entertainment industry, along with major brands and chains that had the means to expand to every corner of the world. That culture has now seeped into the cultures of dozens of other countries, where it’s rarely seen as a good thing.
3. Global Warming Skepticism
It is virtually unanimously accepted as fact that global warming is real and caused in part by human activity; except in America, that is, where the political right has recast it as a giant conspiracy theory and money-making scheme perpetrated on Americans by Al Gore and the political left. As a result, polling in the U.S. shows a clear divide in global warming beliefs along political lines. None of this has helped Americans or the rest of the world tackle the issue; it has all been for the sake of scoring political points, and the rest of the world just shakes its head.
2. Their Overstated Influence
Americans tend to take a lot of credit for the world not being a completely anarchic place ruled by tyrants who would crush the planet under their heels if not for America keeping everyone safe. Thinking that the world owes them an endless debt of gratitude tends instead to engender a lot of animosity.
1. Their Drain on the World’s Resources
Despite accounting for only 5% of the world’s population, Americans consume 26% of its energy and produce 23% of its carbon emissions. They consume 25% more calories daily than they need, throw out 200,000 tons of edible food every day, and have the highest per-capita water usage in the developed world. Their wasteful, consumerism-driven society produces so much refuse that some of it has to be exported to other countries. It’s a model that could have catastrophic implications for the entire world, making it the leading thing the rest of the world can’t stand about Americans.