Women in American Society After World War II


Describe the changing role of women in American society after World War II.
Before World War II, the traditional role of women in mainstream American culture was that of wife and mother. After the war, however, the role of women in American society changed greatly.
1: The changing role from home to new jobs

After the outbreak of World War II, a large share of the American male labor force went to fight on the front lines, and the domestic workforce began to shrink. Many women left their homes and joined the production workforce. By the end of World War II, with so many men enlisted, the jobs open to American women had begun to change. This was the background of the changing role of women in postwar America.

As women's employment increased, however, they felt the unequal status of men and women ever more keenly, wishing to escape the low-status role of unpaid housework and demanding equal rights and opportunities with men. The famous women's leader Betty Friedan was representative of the contemporary American women's movement. She wrote The Feminine Mystique, revealing how traditional American culture imprisoned and defined women's roles, and called on women to break free of family bonds and seek the value of their own lives. This thinking awakened many traditional American housewives, who were no longer satisfied with the sense of accomplishment the family brought and began to search for their own identities and rightful social roles. In particular, many well-educated women had a stable income base; facing the unequal status of men and women, they began to reflect deeply and tried to overturn the traditional conception of women. The rapid rise of the United States after World War II laid the economic foundation for the second women's movement in the United States.

Women's movements also fought for equal rights in the law and in legislative bodies. After arduous struggles, women obtained the right of political participation and the number of women in the United States Congress increased; American women gradually became a powerful force, which marked the political change in women's role in American society after World War II. The improvement of women's political status in turn promoted that changing role. As for women's economic status, women's occupations changed greatly: many women entered fields of work once dominated by men, such as accounting, engineering, medicine, and driving. Women had also demanded equal pay for equal work, and equal benefits, since the end of World War II.
