What was Africa really like before and after colonialism, and what changes did the Colonial Era bring? Colonialism is the establishment, maintenance, acquisition, and expansion of colonies in one territory by people from another territory. This essay examines the changes that shaped the Africa we see in the present day, asks whether colonialism and the momentum of European countries really did lead to the downfall of Africa, and considers whether Africa should be compensated for the destruction of its countries. This question matters because Africa, though often overlooked, is a huge part of our world. Not only is it the second largest continent, but we need to understand the past to figure out how Africa became what it is today.
I find that it all leads back to one main factor: economics. European countries developed through the exploitation of African countries by means of unfair trade and the use of their natural resources. The European powers made sure that each colony produced only one crop and sold it to them: Ghana produced cocoa, Kenya produced tea. European countries obtained raw materials from Africa at incredibly cheap prices and sold them back to Africa at much higher prices. For example, Ghana sold cocoa to Europe for very little, and Europe then sold Ghana’s cocoa to Kenya for much more money. Europeans grew richer while Africa continually lost power within this trade system, and slowly its markets collapsed. Inequality factored into the colonization of Africa in every sense, including racism and gender discrimination, but mainly in the sense of class, in other words economic standing. Firstly, the categorization of races based on physical attributes was prevalent in Europe. Many Europeans regarded themselves as the supreme civilization in the world, and some saw it as their mission to “enlighten” and “civilize” peoples such as Africans. It is commonly assumed that Europeans enslaved Africans for racist reasons, but they were enslaved mainly for economic motives. The profit in slave labor was immense, and European countries understood this well before the start of the colonial period. Does religion have an influence?
In pre-colonial Africa, women’s status was marked by considerable independence, and they contributed to society politically and economically. They owned their own land, and with that land they acquired power. When colonial rule came, the colonizers looked for male leaders to govern. African women attempted to contest the idea that women are unable to have worthy roles in society, but under the European influence of male superiority the former notion of women having influence was quickly changed. The changes occurred firstly through women losing access to and control of their land, which made them more economically dependent on men; they also lost their political roles. What often occurred was that colonies placed authority in the hands of elders determined to undermine women’s power, which provoked uprisings like the “Women’s War” in Igboland in 1929. The colonial powers did assist in bringing infrastructure such as harbors, railroads, hospitals, and schools, although the schools were mainly available only to richer Western families. This eventually brought forth industrialization. Do these positive effects outweigh everything else? “All they did was bring in technology and war. Not only took the minerals but they took artifacts and the Italians took a statue from Ethiopia where I grew up. The Italians were in Ethiopia for 5 years until we fought back and won.” – Samuel Adera. Samuel is my roommate who lived in Ethiopia for 18 years.