
How Did World War 1 Change American Society?


In World War 1 a lot changed for the United States. One thing that changed was its foreign policy. We know it changed because the country went from a period of isolationism to being involved in world affairs. We are going to look at how the war changed American society, why the United States entered the war, and how its foreign policy changed.
During World War 1 a lot changed about American society. Among the changes, women gained the right to vote, women held more jobs, and the Great Migration took place.
Women gained the right to vote when the Nineteenth Amendment, passed by Congress in 1919, was ratified by three-fourths of the states. Women felt they had more of a say in society because so many men were away at war. The amendment said that the right to vote shall not be denied on account of sex.

George Washington had encouraged the United States to take a neutral approach in order to avoid being drawn into foreign wars. Woodrow Wilson wanted to continue that policy of neutrality, but he eventually asked Congress to declare war on Germany. After the war, the Senate refused to ratify the Treaty of Versailles or join the League of Nations, because many thought that joining the League would lead to another war. The United States continued a policy of isolationism up until World War 2.
In conclusion, World War 1 changed American society and American foreign policy. American society changed in that women gained the right to vote and held more jobs. One thing that happened during the war was the Great Migration, in which over 6 million African Americans moved north. The United States did not enter the war until 1917 because of its policy of isolationism, but it entered because Germany sank a British ship with 128 American passengers on board, Germany sent Mexico a telegram trying to form an alliance, and America had loaned the Allied Powers large amounts of money that it did not want to lose if they were defeated. The United States also changed its foreign policy from isolationism to being involved in world affairs.
