Throughout the coming out process I was very hesitant about calling myself a lesbian; I always preferred the term gay. One huge reason was the stereotypes associated with lesbians, such as the angry feminist.
I’m currently taking a women’s studies class at one of my community colleges, and we just finished reading a section about what feminism is, the different waves of it, and some other information. It was really interesting. I found out that according to the definition of feminism, I am indeed a feminist, as I bet many people are. The definition goes something like this: believing in the economic, political, and social equality of the sexes. Sorry, I don’t have my book next to me, so it may be a little off. In one study, researchers found most women agreed with the definition of feminism, but few considered themselves feminists.
How did we get such a bad rap? Man hating. Bra burning. Angry women with a chip on their shoulder. Why are so many people like me afraid to be associated with the term feminist? Why do we associate lesbians with feminists? Why aren’t people aware there are/were male feminists as well? Why don’t people realize we don’t want a matriarchy, we just want equality?
So I guess, just as I have accepted my sexuality, I need to come to terms with my feminist beliefs as well, educate others about what that means, and debunk the stereotypes.