Feminism in the United States
Feminism in the United States refers to a range of movements and ideologies aimed at defining, establishing, and defending a state of equal political, economic, cultural, and social rights for women in the United States.
- Feminism in the United States has never emerged from the women who are most victimized by sexist oppression; women who are daily beaten down, mentally, physically, and spiritually – women who are powerless to change their condition in life. They are a silent majority. A mark of their victimization is that they accept their lot in life without visible question, without organized protest, without collective anger or rage.
- bell hooks, Feminist Theory: From Margin to Center (1984)
- Women of color are mirrors in which white women are supposed to see themselves but, instead, see themselves as no other mirror can show them—as selves that are plural and who instead of being righteous, moral beings, are also participants and perpetuators of a racist system.
- Mariana Ortega, "Being Lovingly, Knowingly Ignorant: White Feminism and Women of Color", Hypatia, vol. 21, no. 3 (Summer 2006)