The Vietnam War was a fucking disaster. There hasn't been a justified war fought by America since WW2. Why do guys here want more people to die in rich men's wars? In my "evil feminist" world, men live. In this "red pilled" world, men die because rich men find them expendable. There were plenty of women who hated the Vietnam War too, but for some reason we don't talk about them. I feel like the men on here just don't actually care about fixing male issues.