19th Century American Urbanization Essay

The late nineteenth century brought an era of urbanization. Bright lights, job opportunities, and the company of friends lured people from across the nation and around the world to urban centers like Manhattan, Chicago, and Philadelphia. The settlement of millions of people in these cities, along with the changing nature of work, dramatically altered the lives of urban residents. Both immigrants and native-born Americans faced unique challenges and changes as they settled into a more urban mode of life. Immigrants adapted to American life yet preserved their cultures through newspapers, societies, and schools. Americans saw a change in the roles of women, who shifted from being housewives to taking control of their careers, decisions, and education.

Advances in literacy, technology, medicine, and working life were drastic during this period of urbanization and caused numerous social and familial changes. The biggest social change of the era concerned women's roles in society. For most of the early nineteenth century, women were seen as housewives, living in a separate and feminine sphere of life. They were expected to be entertaining and to care for the children while men spent long hours at work. As more jobs sprang up, women began to work as men did. They started to wear tailored suits "modeled after men's shirts" (618) and stopped wearing corsets. Women also began to marry later in life, or not to marry at all, causing a sharp decline in fertility rates. Many did so to care for fewer children or to pursue their own careers. Finally, women began to pursue higher education, even though their education was not considered as important as men's. They formed study circles to improve their general knowledge and, in the early twentieth century, began going to college. "By 1900, women made up around 40 percent of college students" (626). In short, the urbanization of the nation brought changes in women's roles in society: they began to work, to shift their focus in life, and to pursue education.