Someone explain feminism without being judgmental.
Go.
Not even being sarcastic.
Please, I want to know what the fuss is about.
EDIT: GameStop and Walmart are popular places
-
It's okay to be sexist, as long as it's not towards women.