I guess my question is who gave the Americans the right? I say this as an American. But would not the world be a better place if we just minded our own business and quit nation-building and stoking non-existent fires?
I’m German and went to the US for a year as a high school student.
My US history teacher literally told us that the US is the world police. Because of that I believe that many Americans think that way.
Kinda how they were “last man standing” in WW2. Everybody else got severely fucked, and the US won them over with the Marshall aid program, which got us to a bipolar world with NATO in which the US was the hegemon.
After the fall of the Soviet Union and before the rise of China there was only one superpower that could act as such militarily, and the US continued its power trip.
Nobody gave ‘murica any right; they just imposed themselves. The simple answer is imperialism. The USA has always been a power- and money-hungry bitch and has been putting nations, populations and markets under their boot (not always through military force) for profit since the late 1800s. Yes, they’ve been an evil empire for that long. Latin America as a whole has suffered many hells so Uncle Sam could keep commodity prices super low.
I think you’re setting the timeline about a century and a half too late. The people who would become the first US citizens were genociding the Native Americans as early as 1750.
Pretty much when the US was the only super power to survive WWII unscathed.
Also, having developed atomic hellfire, and the will to use it (twice), kinda makes you the big kid on the playground.
This right here. The US was isolationist prior to WWII, but then got attacked and drawn into active war.
Since the US mainland was untouched directly by the war, and industry boomed coming out of the Depression and during the war, they came out of it better off than Europe, which had a lot of rebuilding to do.
As a result of the war and the need for defense, they established bases all across the globe, and over the last 80-90 years, as the political system grew more corrupt, American hegemony grew right along with it.
You sure are an American, since you don’t know your own history.
After World War Two, Europe was busy putting itself back together. It left an opening that the US stepped into. And who wouldn’t like to be the big dog in the yard?
Pretty much this. Up to that point, it was Britain and a few other European nations that were doing all the management* in various places in the world. After WWII, they realised: “You know what, we’re tired and worn out and everyone wants us out anyway. We’re going low energy to rebuild at home. Someone else can step in if they want.”
* a.k.a. “Colonialism”. Management is an odd choice of synonym I grant you, but once you’ve got a colony, it’s in your interests to run things in good order. Until the locals rightfully kick you out, that is.
Because people in power only want one thing - more power. They only fear one thing - loosing power.
*Losing
America was the standard for a Democratic Republic after WW2.
after the war we helped most of Europe return to normal and even improved quality of life and living standards. part of that help came with stipulations that gave the US a measure of control within the countries that received it.
Had the US not stepped in at the time to stabilize Europe, another war would have likely happened and another, and another.
My guess is most of Europe would have fallen under Russian rule, or at the very least been heavily influenced by it, if the US hadn’t stepped up.
I suppose Europeans don’t look at how bad a state the war left Europe in and often just want to forget the atrocities, but that’s not an excuse for blaming the hand that helped you in your time of need.
People turned to Russia specifically because they disapproved of US imperialism and wanted to counter its power while avoiding being doomed by capitalism. I’m not saying this was the ideal solution, but at least if they had succeeded we wouldn’t be in the position we are today.
US imperialism didn’t happen until the 1950s, well after the war.
this was, in part, due to the private investments from large American companies at the time. in fact, the American economy was booming for three reasons:
- war was over and people were desperate to find stability and peace
- Americans at home got through the war mostly unscathed and now had an abundance of work, which in turn meant an abundance of money to spend
- Europe desperately needed materials and products to rebuild its own economy, which only further boosted American GDP from a previously untouched market. private investment by American companies within Europe increased profits further.
in a sense, because Europe was so weak after the war, it only fed US corporate imperialism. Had Europe been able to stand on its own, the United States might not have had such an industrial boom, and the similarities between Europe and the US might not have been so significant.
one might even draw a strong correlation between American corporate interests and the total subservience of government alliances at that time. our government had, up until then, mostly stayed neutral in disputes between corporations and citizens. that changed, though, because of the newly created military-industrial complex built up to feed the war. afterwards you had defense contractors that saw dollar signs, and the tradition continues to this day.
speculation on my part: the political climate of the current day is the fruit borne of that union of corporate and state all those years ago; this has been the agenda of the American elite all along, and they are currently in the final seconds of the “game of thrones”.
US imperialism didn’t happen until the 1950s, well after the war.
Absolute whitewashing of USian crimes against humanity all over the first half of the 20th century. Examples: the Big Stick ideology.
The US also constantly did shit like this in the Americas all over the 19th century; see the United Fruit Company or the Military Government of Cuba.
It mostly started with the Cold War. The US was obsessed with stopping the perceived threat of communism. In the process, it discovered the benefits of power-mongering and war profiteering.
It most definitely did not start with the Cold War. The US was happily invading and controlling the politics of as much of the Americas as it could well before WW2, with stuff such as the United Fruit Company or the Big Stick ideology. The 1898 invasion of Cuba and the establishment of a military government there come to mind.