Western Victories in a Changing World

            At the turn of the twentieth century, wars reshaped the world and made it much of what it is today. These wars carved new countries out of existing ones and brought the world's superpowers to prominence. The West emerged victorious in this period because of its growing strength, its allies, and its power of negotiation through treaties.

             Population growth in the Western world was a major reason it drew so much of the world's attention. People from all over the globe sought to move to America as its culture grew in popularity and influence. Western nations shared ideas much as they had during the Enlightenment, and this exchange shaped how their civilization progressed. They created international organizations such as the Red Cross, founded alongside the Geneva Convention, the International Telegraphic Union, and the Universal Postal Union, helping to unite the world and make thought more international.

             World War I exposed the division of Europe into two rival alliance systems. By this time most of the world's land had already been claimed by various countries, but disputes continued over who actually controlled parts of it. The war was fought on three fronts: to the west, east, and south of Germany. On the western front the Germans fought the British and French; on the eastern front Germany and Austria-Hungary fought the Russians; and on the southern front they fought the Italians, who had joined forces with the French and British. New military technology on the western front forced both sides into trench warfare, and the battle lines advanced very little.

             The United States first became involved by loaning money to the warring governments. It remained officially neutral but was strongly pro-British, and when German submarine attacks sank ships carrying Americans, opinion about the war quickly shifted. The United States entered the war in 1917 and, with its strong military and industrial capacity, turned the tide against Germany.
