More women are feminists than we realize; it's just that the word's been given such a negative connotation that people hear it and picture ugly, hairy, man-hating women who can't get laid, which is the furthest thing from the truth. I am a proud feminist, and it's not because I think women are better; I simply want to be recognized as an equal. I don't think wanting to be paid the same amount as my counterparts, pushing for better childcare (and I don't even have kids!), and hating the horror known as "rape culture" are bad things, but maybe that's just me. I also despise the roles that men are pressured into (they have their disadvantages too). I mean, we're all just people, ya know?