Aside from men being horny in the 1800s, and this being a holdover from that time, is there any actual reason why this hasn't changed?
If society was dominated by women, would this be more likely to change?
I was sweating my ass off hiking in the hot sun, and the question has been bothering me all day since my top soaked through.
I live in Ontario, Canada, where it’s perfectly legal for women to be topless.
I’ve never actually seen anyone exercise that right (at least in person), but it’s a right.
Legally it's allowed, but socially it isn't. We can't go topless without being endlessly harassed and probably assaulted. It was bad enough when I walked home in just my bikini top after going swimming.
I can’t argue that one bit.
I'm sorry men are so stupid. (I'm presuming it's men.)
Women would absolutely be judgmental about it, but it'd definitely be men doing the harassing and assaulting.