Different forms of sex work are rising dramatically in society, and sex work is being pushed heavily in the media. The number one song in the world right now is by a former stripper and glorifies selling sex. It seems there is a cultural trend being pushed to make sex work acceptable, and on the ground where I live, this trend seems to be taking root. Online dating is filled with women selling themselves; for one of my less attractive friends, almost every match proposes that the poor guy pay for sex. Some clubs, restaurants, and hotels (pre-COVID) had become well known as places where women meet prospective clients, sugar daddies, or sex tourists. And we can all see the boom in virtual stripping services and the "Instagram look," which in my experience is usually a tell-tale sign that a woman is selling her body.

With this trend, I think we might reach a point where a significant chunk of women are selling their bodies. I already feel like a large share of the 8s and 9s are in the business. How do you think cultural attitudes toward women who sell their bodies are going to shift? Break down the attitudes of women vs. men if you can. Or do you think they won't change at all from today's puritan thinking? Or have attitudes already changed?