Gender Roles in the 1950s
Sexism is defined as prejudice or discrimination based on sex, especially discrimination against women. In modern-day America, we struggle with this issue every day, whether in the workplace, in social settings, or in family life. In the 1950s, this issue plagued our country dramatically and left an intense aftermath. Although progress has been made, the gender roles that took hold after World War II have had a lasting impact on American society.
As stated, the end of World War II brought many changes to American society. While the war was happening, women filled roles that had mostly been occupied by men; they took the factory jobs and became essential workers in their communities. Once the war ended, men returned from deployment and women were driven out of employment. Even women who had been successfully pursuing these careers were removed from work and intensely discriminated against. This left women with only one option: staying home and taking care of the children and the household. This postwar effect created a harsh barrier between the genders and what their roles in society "should" be.
Men played the more dominant roles in society. Within the family, they were known as the providers, expected to come home from work to a home
Post-World War II America left an everlasting wound in our society's gender gap, one that we will spend years trying to bandage. The era of women being submissive and men holding all the dominance and power is in the process of being left behind, but not completely. The way we portrayed gender roles then, in the workplace, in social settings, and in family life, has dramatically shaped how we portray them in the 21st century. Without the war's effect on society's view of gender, America would not be where it is today.