
NauenThen

Men & rape

Slate has been debating: is it "victim blaming" to tell women to be careful, not to get drunk, & all that? Shouldn't we, rather, address the rapists & a culture that tells (white) men they are more important than anyone else? (Does it have to be one or the other?)

If men rape because they feel entitled to, how much of that is because the rest of us (women, other men, the legal system, the corporate world, schools) are complicit? How many women automatically let their menfolk take the wheel, i.e., take control? I heard a woman reverently call herself "the mother of sons" in a tone that shouted that this was ludicrously better than having daughters.

And how is it that so many men claim that women run the show, when it's men who get deferred to, paid more, given first choice, and so on?