If you browse through feminist subs on Reddit, you'll see a common complaint: men telling women "You should smile more" and women getting offended by it.

Why is this something to be offended by? I was once in a club, and some girl passing by leaned over (the music was loud) and told me to smile. And you know what I did? I smiled. I did not get offended by the suggestion. When someone tells you that, they only mean well.

The fact that they get offended by this really says a lot: they consider it a genuine insult. It makes you wonder whether they know what oppression really is.