I keep noticing a pattern of guys saying society lied to them that being nice would be rewarded, but who told them the reward was vagina?

For example:

“Boys are taught through media, school, etc. to respect women, bring them flowers, be nice, keep chasing even when they play hard to get, and that then women will want to be with you, and it’s false. Morality doesn’t bring arousal.”

Again, who told men that morals make women wet? Who told them that’s the sole reason to have morals?

Before someone says it’s about getting a girlfriend: you’re telling me you can’t find a single good guy with a girlfriend to help you out? Is it only assholes around you who have girlfriends?

I truly can’t fathom this, because as a child I was told (as a Christian) to be a good person because it’s the right thing to do. It was not so I could get fucked.