
Is my dentist scamming me?
Am I Normal? with Mona Chalabi
Why Do Dentists Tell You to Get Your Teeth Whitened?
There's a power dynamic involved when a health care professional tells you that you should get your teeth whitened, says Dill. "We know that dentists are known arbiters of beauty," says Dill. Some patients show up to a dentist asking for a full set of veneers, even though that involves grinding down the front enamel of someone's teeth.