Women keep telling each other how much freedom they have, and how they are strong and independent because they can make their own money and don't depend on anyone. But if I were to say the same thing about being free and happy, people would attack me with shaming language and tell me I'm just rationalizing my inability to attract women.

Even in the news there are articles about women in Japan and Korea being "liberated" because they don't need a man, yet when men avoid marriage they are called "herbivore men" and "manchildren". Why does the media do this? I know I shouldn't let it bother me, but it's starting to annoy me.