In quite a few posts here, I've seen women state that men either need to level up and go to college, or that men without a college education are lazy and broke. I have some theories on this, but I'm more curious why it's so important, because there are plenty of careers out there that pay amazingly well and require no college education.