Throughout history, women have played a vital role in the prosperity and growth of their communities, and it is important to understand just how essential women’s roles were in the expansion of America. Women’s roles varied depending on the era, their race, and their culture. Native American women were especially important in their communities; their duties extended far beyond mothering the tribe’s children. They belonged to a culture that gave them respect and power, one that traditionally allowed women a sense of autonomy and equality. Native American women’s work centered on the home: they were in charge of gathering materials and building homes for the tribe. Respected in their communities, they provided a feeling of strength and consistency within their tribes. However, due to colonization, women’s roles and lives changed throughout the colonial era.
By the 1630s, disease and epidemics pervaded most New England tribes. During this period Europeans brought many new technologies and ways of life with them, and one effect of these exchanges was the arrival and spread of many diseases. Native Americans had not built up immunity to the new diseases, nor did they have medicines to combat them. These epidemics greatly affected Native American women: many fell victim to the diseases themselves, and many families were broken apart by the casualties. Native American women and their tribes lost many loved ones to these plagues. Another change in the lives of native women was intermarriage with Europeans. Along with everyday contact with Europeans, these intermarriages led many natives to adopt European ways of life. The lives of native women were shaped by European culture, and they came to occupy a unique position between two cultures.