I’m trying to get perspective on this particular beauty standard and how I want to approach it. Do people whiten their teeth where you live? Is it seen as expected to do so? Do you live in a city?
I have healthy teeth that have nevertheless seen a lot of tea and coffee. I have generally thought of this as similar to wrinkles, i.e. a natural thing bodies do that I don’t want to pay money to fix since it isn’t broken. I still think this. But lately I have been feeling like there might be real social stigma attached to my teeth being discolored, and I am wondering whether that’s actually the case. Has whitening teeth become an expected thing for all adults to do now? I thought I’d ask how other people feel and think about this and what the general norm is in your social circle.
Edit: thanks for the responses, everybody.
I can’t answer your societal questions since I don’t really pay attention to that. But I do whiten from time to time, purely for my self-esteem.
I smoked for years and became self-conscious about my teeth turning yellowish. So I don’t smoke anymore and get a whitening maybe once a year at most.
It’s a bit like exercise: I feel better in my body, so I’m happier, more social, and I smile and laugh more.