I was reading an article that says studies show that when men go to college they begin to shift right politically, while women remain on the left. Why do you think that is, and what role do you think the college environment, dating, and media play in this?

It also says men are less likely to enroll in college than women, so does that have an effect?

https://www.newsweek.com/women-college-are-moving-left-men-are-migrating-right-opinion-1742192?amp=1