Perceptions of Imperialism and Empires

Feb 2019
Created so it doesn't derail existing threads.

Until relatively recently, imperialism and empire-building were seen as good things. Calling a country an empire was a compliment, even an indicator of a certain level of power and prestige. During the Cold War, however, the terms acquired a much more negative meaning, and today calling a country an empire reads as an insult or a criticism. What caused this shift? When exactly did it occur, and why did the terms gain their negative connotation? Furthermore, should the terms carry a negative or a positive meaning today?