WWI Effects on American Society

Although much of the media focuses on WWII, many historians argue that WWI has more to teach us because of its complexity and its many dimensions. On that basis, it is worth examining how WWI affected the American public by changing how Americans viewed one another and their own beliefs. Looking at the events of WWI, it is clear that the deaths of so many men during the war helped raise the status of women and changed how women were viewed in American society. WWI was a horrible war because many of the men who volunteered as soldiers did not know the terrible deaths they would meet. Because Europe had been relatively peaceful since the Napoleonic Wars, people at the time did not fully realize how destructive modern warfare would be.