
Is my dentist scamming me?
Am I Normal? with Mona Chalabi
Why Do Dentists Tell You to Get Your Teeth Whitened?
There's a power dynamic involved when a health care professional tells you that you should get your teeth whitened, says Dill. "We know that dentists are known arbiters of beauty," Dill says. Some patients show up to a dentist asking for a full set of veneers, even though that involves grinding down the front enamel of someone's teeth.