When one thinks of the health care system, words that usually come to mind are safety, protection, quality care, and the like. As Americans, we live in a progressive, industrialized country that has made many technological advances in the sciences. In other words, we are a developed country that provides many opportunities to those who reside here. The foundation of our country is based on equality, fairness, and justice. However, the question that has often been raised throughout history is: to whom do these qualities apply? The Declaration of Independence is one of the early documents that outline the principles of our country. It states, “We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty, and the pursuit of Happiness” (The Declaration). The key phrase in this statement is “all men,” a category that should include women as well. The United States of America prides itself on being a “melting pot.”
The Women’s Liberation group made a statement saying,
“In spite of the fact that it is women who are taking the pill and taking the risks, it was legislators, the doctors, and the drug company’s representatives, all men of course, who were testifying and dissecting women as if they were no more important than the laboratory animals they work with every day” (Vargas).
These men made the decision to test on these women, and they also made the decisions concerning the repercussions of the study. The demeaning manner in which this study was discussed is no foreign concept for women throughout history. For many years, men have held power over many things, especially concerning women of color and their health. In their minds, they rationalize that they are “helping” these women, but at what cost to the women themselves?