I mean, history is written by the victors and is never kind to the vanquished, but I think it's the whole "production line extermination of peoples based on racial superiority" that really made the aftermath of WWII particularly bad for Germany. Sure, ethnic cleansing had happened before (and has happened since, in the former Yugoslavia for example), but not on quite such a scale, and not really since the Middle Ages in terms of clearing out whole territories for the resettlement of your own people. After WWI, while there were still bad feelings between the winning and losing sides, nothing quite so horrendous happened to civilian populations. In the immediate aftermath of WWII, the horror and the soul-searching were entirely justified. I just don't think people who were in no way responsible should continue to live with that guilt.
I think some of it lies in the fact that, if it weren't for Hitler, Germany would be worse off. It's a big topic and not really relevant here, but even after WWII Germany got better terms than after WWI, which would have made Germany almost a modern vassal state or third world country.
Very debatable. As Luna says, not only did Germany lose a massive chunk of its territory (areas which had been historically German for hundreds of years, from Poland to the Baltic States to Czechoslovakia), but entire cities were wiped off the map, millions of German civilians died in the bombing or were subjected to war crimes by the invading Soviet troops, and Germany itself was demilitarised and occupied by foreign armies, something which continues to this day. Kind of the exact opposite of what Hitler intended.
WWII also, of course, enabled the US to become the dominant world power, since Britain was totally drained of resources. Again, not something I think the Nazis would have liked to see: while they may have been grudgingly at war with Britain, they absolutely hated the idea of US supremacy, viewing the US as a cultureless nation of slaves to (in their opinion, Jewish) finance capital. (On the subject of which is worse, today's US economic imperialism vs. yesterday's British cultural imperialism, your mileage may vary.)
The crippling war reparations Germany was expected to pay after WWI were a massive mistake, one which helped wreck the German economy and created resentment that fuelled support for the Nazis. But Germany had already negotiated to stop paying those to everyone except the Americans before the Nazis even came to power. Most nations involved realised that an externally debt-crippled Germany (something which was fuelling a rise in nationalism) was going to be no good for anyone*; indeed, Britain supported that view and was ultimately in favour of cancelling the debt altogether.
My personal view is that both Britain and Germany would be considerably better off if the Nazis had not come to power and WWII had never occurred. Britain ostensibly "won" that war, but both we and Germany lost our place as world powers and ended up under the thumb of the US (and half of Germany under the thumb of the USSR). The US and the USSR were the only real "winners", as both came out of the war more powerful.
*Funny how Germany doesn't seem to have learned this lesson when it comes to its relationship with Greece.