The war had far-reaching effects on American society. One of the biggest changes that followed its outbreak was the new set of opportunities presented to women and to African Americans. Neither group had been given equal opportunities before the war. After the war started, they got the chance they had been looking for, but they continued to face limitations.
Women have always played an important role in American life. Before the war broke out, women were largely restricted to housewife duties such as cooking and cleaning, and not every woman wanted that kind of work. I believe people started to realize that not everyone is cut out for the same job. After the war started and men were shipped overseas, women were finally able to take on more productive jobs in factories and offices. Women were even given the chance to become nurses for the Red Cross. “Women’s service in the Red Cross in World War One required them to drive cars and be mechanics in the US, but it also sent them to the edges of the front lines in Europe. Their service made it obvious to the US how important women were. While nursing was not a new profession for women, nurses’ importance grew.” Women were finally being recognized for what they could contribute.
Even though African Americans were undervalued on American soil, they stepped up to play their part in the war. More than 380,000 African Americans served in the Army during World War I. Those serving were given important jobs as laborers, building roads, bridges, and trenches. African Americans who did not serve in the war were quick to take higher-paying jobs in the steel, mining, shipbuilding, and automobile manufacturing industries. There was a growing demand for supplies, mostly from overseas, that needed to be met. African Americans stepped up and helped keep the American economy afloat while the Army was fighting to keep peace across the world.