My reason for starting this post is my lack of education. I missed a lot of school as a kid, so when I notice a gap in my own knowledge I like to bring it up for discussion and learn something.

When feminists were fighting for the right to vote, equal treatment in the workplace, and all of those things, I'm guessing that once those goals were achieved, that's when feminism was no longer needed.

I think we all agree that feminism was needed at a certain point, but then it just turned into whining and bitching about wanting extra privileges and special treatment.

Help me understand the point at which it was no longer needed so that I can talk about it better with other people.