I don't know if anyone else has noticed this. Every country (or region within a country) that is more developed and richer tends to drift toward feminism. The richer and more developed a place is, the more strongly feminist it seems to be. The USA is the richest and most developed country in the world, and it seems to be the most feminist country as well. Rich countries in Europe such as the UK, Germany and Sweden are all quite feminist. Poorer southern countries such as Greece, Italy and Turkey are much less so.

Countries that are becoming more developed, such as Brazil, are becoming more feminist too. Apparently the big urban areas in Brazil are now developing a strong feminist presence. Likewise, rural areas within countries (which tend to be poorer than cities) are less feminist than urban areas.

It's as if, once a country or region gets richer, it decides it wants to hate men for whatever reason. Has anyone else noticed this, and why do you think it happens?