After the Allies defeated Germany and brought World War II in Europe to an end in May 1945, peace finally prevailed on the Continent. The Germans were beaten, desperate and ashamed, but also ready to return to normal life after the trauma.

By DER SPIEGEL Staff
from DER SPIEGEL - International https://ift.tt/2yzgvhJ