I 100% don't believe the patriarchy is a real thing. It's nothing but an imaginary boogeyman that misandrists blame their failures and shortcomings in life on rather than accept accountability, and something to blame whenever they don't receive favorable treatment. It's a means of demonizing men as a whole, and it brainwashes impressionable young women into thinking it's real, that it's out to get them, and that men are their sworn enemies. Do some parts of the world, including third-world nations, have brutal authoritarian regimes that make life a living hell for women and men alike? Absolutely. But a patriarchy? As in an organization of evil men in power conspiring to keep women from succeeding, one that's behind all the evils in the world? An organization so evil it's opposed not only to women but to men as well, since misandrists will fall back on the "patriarchy hurts men, too" argument? No way.

I'm sick and tired of seeing and hearing anything about the "patriarchy" anywhere, and it's doubly annoying when it creeps into entertainment. I almost groaned out loud in the theater watching Exorcist: Believer (which wasn't a particularly good film, FYI, you're not missing out) when one of the characters mentioned it. It's honestly disturbing that so many adults genuinely believe it exists. I think the moment anyone invokes it, they cease to be a credible person worth taking seriously about anything. The "patriarchy" is no different from the ZOG for white supremacists or the "deep state" for hardcore right-wingers: a collective delusion and imaginary enemy to place blame on. And it's all the more irritating, as someone whose views are mostly left-leaning, when people assume belief in a patriarchy is a left-wing position, because I find the idea asinine.