Throughout history, women have struggled to gain their rights. Ever since the story of Adam and Eve, Eve has been cast as the one who makes mistakes. American history regarding women is a shameful one, for women have been suppressed since the beginning. In the 1600s, Puritans traveled to America to gain freedom, yet they suppressed their own women as much as possible. In 1692, women were accused of being witches and publicly hanged, and this was seen as acceptable. Violence against women was widely tolerated, and "housewife" was the only role available. Women were trained to rely heavily on their husbands and were not allowed to earn money for themselves. This culture of misogyny led women to marry as a survival tactic. Eventually, women were able to work, often taking jobs as nurses or seamstresses during wartime. Regardless of their position, women were always paid less than men, and this remains a common theme today. Additionally, public education was almost completely inaccessible to women until a women's department of education was added in the 1770s. Approaching the 1850s, the women's rights movement began to take shape.
Between 1907 and 1922, the movement achieved most of its goals, including laws regarding minimum wage and child labor. These gains helped bring women into the workforce and allowed them to receive a somewhat fair wage for their work. Finally, in 1920, the federal women's suffrage amendment, first written in 1878, was ratified by the states. This amendment allowed women to vote and, at last, to take part in our government. Throughout the 1900s, women's rights were gained one by one, some through legislation such as the Equal Credit Opportunity Act and the Pregnancy Discrimination Act, and some through proposed measures like the Equal Rights Amendment. Women have been oppressed since the beginning of time and, although gender equality is on the rise, they are still being subjugated.