AI Hype Machine w/ Meredith Whittaker, Ed Ongweso, and Sarah West

The Dig

The Flaws and Harm of Emotion Recognition Systems in Facial Recognition Technology

3min Snip

Computer vision systems that measure facial muscle movements and impute emotions from them are pseudoscience, yet people continue to believe in them. Businesses build these systems to make money off facial recognition technology; Amazon's Rekognition API, for example, added fear detection to its features, compounding the flaws of the original intended use case. The widespread deployment of emotion recognition systems can cause significant harm, especially to communities of color. State authorities should not rely on these systems to inform their actions or decisions, because they are built on low-quality labeled data and repackaged race science and physiognomy. The flaws in these systems can lead to errors at multiple levels and negatively impact people's lives.
