I like how people from Europe post on YouTube about how they believe the US has been going to war for hundreds and hundreds of years, when European nations have been at war amongst themselves forever. Or how I see people from the UK post about how US troops just kill everyone, while seeing their own country as a shining beacon of innocence. I guess the British weren't the ones who tried to enslave the Native Americans or just kill them off for land, or bring slaves to the American continent, or wipe out African villages and then crush any rebellions to keep the native populations under control so they could establish more colonies. It was totally the oil-taking Americans who did those things, not the British Empire.