When did Empire become the "E" Word?

stevev

Ad Honorem
Apr 2017
2,784
Las Vegas, NV USA
#11
Japan remains an empire. Its head of state holds the title Emperor but is purely ceremonial. The final surrender settlement allowed the incumbent monarch to keep his title, on the view that Japan would be easier to govern under occupation. It was controversial, but in hindsight it was probably the right move. The instrument of surrender was signed aboard the US battleship Missouri by Foreign Minister Mamoru Shigemitsu on behalf of Emperor Hirohito and the Japanese government.
 

Theodoric

Ad Honorem
Mar 2012
2,625
#12
For many years, nations proudly declared themselves to be Empires: Russia, Britain, France, and nearly every major power within and without Europe took up the mantle of being an "Empire" at some point.

The question is, when did being an "Empire" become taboo? When you say the word "Empire" to your average layman, they probably think of Star Wars or a TV show about the recording industry. However, I would guess that Empire started becoming taboo during or soon after WWII or the Suez Crisis.
If you are talking about recent history in Western society, it is because during the Cold War the word "Empire" was associated with the USSR, which was itself descended from the Russian Empire. As a trope for evil enemy nations, early instances occur in the 1960s with Star Trek's Klingon Empire and Doctor Who's Dalek Empire. Essentially, it is democracy/freedom versus imperial authoritarian tyranny: the same narrative the West pushed during the Cold War.

The term was most likely vilified in the US as early as the American War of Independence, so it wasn't much of a stretch for Cold War-era media to start using "Empire" as a label for an evil nation.

The term Commonwealth was initially applied only to the self-governing Dominions, from 1926 I think, and as an official title.
It was initially applied to the 17th-century Commonwealth, the English republic of the Cromwellian era that lasted until the Restoration.
 
Mar 2018
522
UK
#13
There was a large assortment of Mughal/Indian princes who were vassals of Britain during its rule of India, and in many cases they retained a great deal of power in their own local regions.
That's true, but I don't think that Westerners considered them "real kings"; more like chieftains. I find it hard to imagine that anyone in London would have thought of a Mughal prince as being on the same level of nobility as the king of Spain or Sweden.
 
Sep 2012
3,606
Bulgaria
#14
Aha, I wondered why empire starts with 'E' while imperial starts with 'I'. The word Empire came into Middle English from Old French (via Norman French): the Latin 'I' evolved into the Old French 'E'. The words Imperial and Imperium, in the sense of supreme power, came directly from Latin. Fun fact: in Russian (and the rest of the Slavic languages, btw) it is Imperia, which happens to be the Latin plural of Imperium, but the Russian emperor is Imperator all right.
 

Futurist

Ad Honoris
May 2014
14,253
SoCal
#15
For many years, nations proudly declared themselves to be Empires: Russia, Britain, France, and nearly every major power within and without Europe took up the mantle of being an "Empire" at some point.

The question is, when did being an "Empire" become taboo? When you say the word "Empire" to your average layman, they probably think of Star Wars or a TV show about the recording industry. However, I would guess that Empire started becoming taboo during or soon after WWII or the Suez Crisis.
The aftermath of WWII seems like a good point for the beginning of such a view. After all, the idea that non-White people shouldn't be ruled over by Whites without their consent really gained popularity in the aftermath of World War II.
 

pugsville

Ad Honorem
Oct 2010
8,334
#16
The aftermath of WWII seems like a good point for the beginning of such a view. After all, the idea that non-White people shouldn't be ruled over by Whites without their consent really gained popularity in the aftermath of World War II.
I would go back further, to WW1. The requirement of large conscript armies in WW1 led to a questioning of what those large sacrifices were for, to wider democracy, wider rights for all citizens, and widespread questioning of and horror at war. The mobilization of large numbers of men led to promises of reform, a better world, fighting for democracy. After WW1, pressure from the masses for greater regard for the masses, more democracy, and responsible government was an established and substantial factor.

The benefits of empire were often restricted to a small class. Once there was widespread democracy in an era of mass armies, there was a reluctance to pay large costs to support empire.

I would argue that these changes were substantially under way in the aftermath of WW1, even if they took a long time to work through.
 

Futurist

Ad Honoris
May 2014
14,253
SoCal
#17
I would go back further, to WW1. The requirement of large conscript armies in WW1 led to a questioning of what those large sacrifices were for, to wider democracy, wider rights for all citizens, and widespread questioning of and horror at war. The mobilization of large numbers of men led to promises of reform, a better world, fighting for democracy. After WW1, pressure from the masses for greater regard for the masses, more democracy, and responsible government was an established and substantial factor.

The benefits of empire were often restricted to a small class. Once there was widespread democracy in an era of mass armies, there was a reluctance to pay large costs to support empire.

I would argue that these changes were substantially under way in the aftermath of WW1, even if they took a long time to work through.
That makes sense. Still, I'm not sure that my own point here is completely invalid. After all, even if one doesn't see much value in empire for oneself, there was the idea of the White Man's Burden in the late 19th and early 20th centuries, the belief that it was the job of Whites to civilize and educate non-Whites. This idea may only have gone away after the end of World War II, which also matches the fact that the end of World War II is when African-Americans in the U.S. began being more assertive about acquiring their own rights as well.
 
Mar 2016
563
Australia
#18
That makes sense. Still, I'm not sure that my own point here is completely invalid. After all, even if one doesn't see much value in empire for oneself, there was the idea of the White Man's Burden in the late 19th and early 20th centuries, the belief that it was the job of Whites to civilize and educate non-Whites. This idea may only have gone away after the end of World War II, which also matches the fact that the end of World War II is when African-Americans in the U.S. began being more assertive about acquiring their own rights as well.
Britain granted India greater autonomy in the 1930s (culminating in the Government of India Act 1935) after widespread protests and demands for more self-government and representation, in part because so many Indians had fought in WW1. Also, the situation of African-Americans was fairly irrelevant to the European colonial empires, because unlike the US they did not have a large proportion of non-whites living in Britain or France proper, just in their colonies.
 

Futurist

Ad Honoris
May 2014
14,253
SoCal
#19
Britain granted India greater autonomy in the 1930s (culminating in the Government of India Act 1935) after widespread protests and demands for more self-government and representation, in part because so many Indians had fought in WW1. Also, the situation of African-Americans was fairly irrelevant to the European colonial empires, because unlike the US they did not have a large proportion of non-whites living in Britain or France proper, just in their colonies.
The part about India is certainly interesting. That said, was full independence for other British colonies besides India also on the agenda even before WWII?

Also, your last point here is certainly correct, but my point is that the anti-colonial movement in much of the world and the Civil Rights Movement in the U.S. were both driven by a belief in the dignity of non-White people, a belief that probably grew significantly in popularity as a result of World War II.
 

pugsville

Ad Honorem
Oct 2010
8,334
#20
That makes sense. Still, I'm not sure that my own point here is completely invalid. After all, even if one doesn't see much value in empire for oneself, there was the idea of the White Man's Burden in the late 19th and early 20th centuries, the belief that it was the job of Whites to civilize and educate non-Whites. This idea may only have gone away after the end of World War II, which also matches the fact that the end of World War II is when African-Americans in the U.S. began being more assertive about acquiring their own rights as well.
Yeah, it's a question of where you draw the line: when the forces start working, or when the results appear. I come down more on the side of once the forces were unleashed, with the mass mobilization and mass propaganda of the First World War. Once you engaged the masses in politics (the result of mass propaganda for mass mobilization) and promised them vague things ("homes for heroes" etc.; once a mass army was mobilized and horrendous costs were being paid, promises were needed), the change was under way.
 
