Monday, February 11, 2008

The Treaty of Versailles (Ends WWI)


After WWI, Germany was blamed for the entire war. The peace settlement cost the country nearly 10 percent of its land, and even its overseas colonies were taken away by the Allies, who blamed Germany for their losses. Germany's economy was ruined, along with its industry and agriculture. After the war, the German people were left with little but their bitterness.
So what was the Treaty of Versailles? It was the official peace treaty that formally ended the war. It was signed exactly five years after the assassination of Archduke Franz Ferdinand, the event that had sparked the war in the first place. Germany had been the second most powerful industrial nation in the world, after the United States. After WWI, Germany was led to a dead end and forced to accept whatever terms the Allies dictated. It was a deeply humiliating defeat. All Germany was left with was the guilt of starting the war, something it had to accept even though it wasn't fair. Later on, Germany was again blamed for causing WWII.

Many other countries used the war as an excuse to gain power, or to take it away from Germany, which had been one of the greatest powers in the world at the time. After the war, many new countries formed, such as Poland, whose land lay partly on territory that had belonged to Germany before the war. WWI and the Treaty of Versailles led to economic disaster in Germany, which was made even worse by the Great Depression. Many people lost their jobs and starved. The panic and fear of the people led to the rise of Hitler.
