Why one should understand the definition of feminism and why we don’t need it in American society
Lately, there has been a great deal of discussion about the topic of feminism and how women are treated in society compared to men. Feminism is the advocacy of women’s rights socially, politically, and economically. In the United States, there is a spectrum of how strongly people believe in feminism. I believe in equal rights for all, and I think that everyone should understand what a feminist really is, but I do not believe that feminism is necessary in order for women to be treated equally.
Feminism is the belief that all women are entitled to the same rights as men. Being a feminist does not mean that you have to be a girl, which is a common misconception.
I told him that I was writing about feminism, and he responded, “Man, I hate feminists. Those chicks are crazy.”
People tend to lump feminism together with “feminazis.” The term “feminism” has been redefined by society to mean women who strive to be treated better than men.
I understand why people do not like feminazis. The term “feminazi” refers to a radical feminist who seeks superiority over men rather than equality. The problem is that many people do not understand what a feminist really is. The common belief is that all feminists are feminazis who want to gain superiority over men. Because so many people think of feminists this way, they claim that they hate feminists and that feminism is stupid. If people fully understood that feminism is simply the belief in gender equality, I am sure they would stop saying that they hate it.
“Women get paid 77 cents for every dollar that men earn” is something that is often heard when feminism is brought up. I do not think that feminism is necessary in order to maintain gender equality. The wage gap statistic that many feminists cite is outdated, and the way women and men are treated in society is fairly equal.