Someone explain feminism without being judgmental.
Go.
Not even being sarcastic.
Please, I want to know what the fuss is about.
EDIT: GameStop and Walmart are popular places
-
I feel I receive more bashing for being a ginger than for being a woman. Feminists are usually over the top. Sure, we want equality, but they seem to want men to be beneath them. Constantly bashing men is not the correct way to receive equal treatment. Act as an equal and people will eventually treat you as such. That's at least always worked for me.