I've read a lot of books about WWII in the past year, covering everything from the end of WWI through the economic troubles, the war itself, and the aftermath. Unfortunately, I don't know anyone with an interest in history, so I have no one to bounce my thoughts off of. The more I've read about the Third Reich, the more it seems to me that they tried, but failed, to do what America did with its westward expansion and killing of the Indians. So if Germany had won, they would have emerged with dominion over Europe as a superpower not so unlike the United States. We are taught that WWII Germany was evil, but how were their aims different from those of the USA? Who are we to call them evil when we got away with it?