Just about everyone looks better when they smile. It’s true regardless of gender. I don’t see where sexism enters the equation.

I feel pretty oblivious. What am I missing?

  • TheBananaKing@lemmy.world · 3 months ago

    It’s not women’s job to be attractive.

    They aren’t there for your viewing pleasure.

    They’re not for you; they’re not a public amenity.

    You have no more right to expect them to smile to make your surroundings more aesthetic than you have to expect them to get their tits out for you to gawp at.