I guess my question is: who gave the Americans the right? I say this as an American. But wouldn't the world be a better place if we just minded our own business and quit nation building and stoking nonexistent fires?
It mostly started with the Cold War. The US was obsessed with stopping the perceived threat of communism. In the process, it discovered the benefits of power mongering and war profiteering.
It most definitely did not start with the Cold War. The US was happily invading and controlling the politics of as much of the Americas as it could well before WW2, through things like the United Fruit Company or the Big Stick ideology. The 1898 invasion of Cuba and the establishment of a military junta there come to mind.