It seems like women have come to terms with reality much more than men: they understand status, money, power, control, sex, love, and societal rules, and they don't deny any of it. Men, on the other hand, pretend far more. They'll act like a girl is beautiful when she isn't, and they'll lie and lie rather than face reality about a lot of things and situations in life. What causes this dynamic? Also, the media and many interactions between the two seem to suggest the polar opposite, at least on the surface, but the further down you dig, the more the truth reveals itself.