It's something I've seen mostly with women, but some guys do it too: the absolute refusal to admit that "my side" might have it better in some regard, always painting themselves as victims in the hope of gaining the moral high ground.

I don't think we can ever agree on anything if both sexes are dead-set on claiming that they're 100% innocent and taken advantage of, while the other side consists of cruel oppressors who hold the power in every aspect of life (e.g. the competing framings of "patriarchy" vs. "gynocentric society").

So, is anyone here willing to admit that their sex actually enjoys some privilege that the other side does not have?