Society doesn't fail to drill it into all of our heads that women have it so bad in the modern world. We're constantly "educated" about things like body positivity and fat acceptance because girls are so depressed and insecure about their looks that it's a national tragedy. We get told how practically every other woman on a college campus is being sexually assaulted (which is bullshit feminist propaganda anyway). We get told how women are being discriminated against in the workplace and denied promotions and opportunities because of their sex (never mind the fact that women make up the majority of college graduates nowadays). We get told how women feel threatened and objectified by men because men have the audacity to be attracted to them and hit on them. We get told how women apparently can't even walk down the street without being catcalled left and right (next time you want to conduct a YouTube 'experiment' to prove that point, try using a fat, homely-looking woman without a big ass and titties, and let's see if she gets the same reactions). We get told how young women and girls are going through this and that and this and that, and how the world just needs to feel sorry for them because ladies deserve it....

Ok well, are we just going to ignore the entire other half of the population and their issues and needs? Why doesn't anyone want to talk about the fact that men also have their own insecurities? Their own body image issues? That mockery and ridicule of small penises is socially sanctioned? That guys who want to improve their image are told that they're being insecure and childish? Why not talk about how guys who want to better understand women in order to have more sexual success are told they're being "entitled" and misogynistic? Or that guys are expected to be stoic and not let their hardships show lest they lose the respect of the people around them, including sexual partners? Why not talk about the fact that men also suffer from domestic violence and rape? That men are the biggest demographic among the unemployed and homeless? And that they still must remain stoic through those experiences? Why doesn't anyone care that men are people with human needs, and that prolonged periods of bullying, sexual rejection, and loneliness produce nothing good for the mental and emotional health of these men? That men even kill themselves over these issues when they can't deal with them?

Why doesn't anyone give a fuck about men? Why don't they care about the pain men go through? Why are women so fucking special that every little grievance they have with the world is treated like a national emergency that must be addressed immediately (even stupid things like 'manspreading', which ironically ignores the fact that women often do dumb shit like leave their purses and shopping bags on an empty seat, producing the same effect)? Meanwhile, guys are blowing their heads off left and right because they're in pain and no one gives a shit: "man up, pussy", "quit crying", "the world doesn't owe you shit, you entitled prick".

Society has made it clear that women have problems and they need help. We fucking get the point. But guess what, ladies? It's not exactly a picnic for us men either.

Until society starts caring more about men and their problems, expect more of a sexist backlash from men who feel they're being ignored and disenfranchised.