I was speaking with a friend of mine who is studying gender studies in college. I studied education myself, but I've always been interested in gender issues, which is why I'm here.

She told me that the professor of her human sexuality class (a man in his 50s) claimed that men don't actually like women.

He claimed that the way our society socializes young boys to "become men" is not so much telling them how to be men, but rather how not to be women. All the classic stuff, ya know: "boys will be boys," "don't be a pussy," "be a man," "man up," don't be sensitive or weak, or caring even, because that's girl stuff. The kind of "toxic" masculinity stuff we talk about a lot here.

He claimed that being a man is mostly defined by not being a woman, and that the way we socialize our young boys is to use shame as a tool to steer them away from these "feminine" things. He then argued that it's nearly impossible to be told your whole life that the worst thing you could do is be like a woman, and then actually like and respect women after that. Even if it's subconscious, he claims there's a lot of inherent bias that comes from this type of upbringing.

However, the *hard* part is that sexual attraction to women, and being sexually attractive to them, is also a huge part of the socialization of men, and being able to "get" women is obviously a status symbol among other men as well. Hence the incredibly strange position of not inherently liking women while also desiring them and sexually pursuing them.

I don't know, this conversation really got me thinking about a lot of things related to these concepts:

Like how men (and women themselves) often call a girl "one of the boys" to justify feelings of friendship, how many men struggle to come up with a female role model when asked, and even why men *seem* quicker to compliment physical attractiveness over personality.

I can tie this into violence against women and why it's so prevalent. I even wonder if it ties into more harmless things like the "women aren't funny" rhetoric that pisses me off so fucking much. It even makes me think about how a lot of the time men are only really seeking approval from other men and don't truly care what women think; to me, that's why they're so confused about what women want.

To sum it up: how are men supposed to truly know and love women when they've been taught since birth that our supposed traits are "bad"?