Assembly Required with Stacey Abrams

Do You See What I See? Building AI for All of Us

Dec 26, 2024
Joy Buolamwini, an AI researcher and artist known for her work on algorithmic bias and the founder of the Algorithmic Justice League, joins the conversation. They dive into the dangers of bias in artificial intelligence, especially in facial recognition technologies. Joy highlights the need for equitable data to combat existing inequalities and discusses actionable steps individuals can take to engage with AI responsibly. They also explore global regulatory approaches and the crucial balance between innovation and privacy.
AI Snips
ANECDOTE

The Coded Gaze

  • Joy Buolamwini's face wasn't detected by facial recognition software, but a white mask was.
  • This sparked her research into algorithmic bias, a phenomenon she terms "the coded gaze."
INSIGHT

Power Shadows

  • Datasets used to train AI often reflect existing power imbalances.
  • These "power shadows" perpetuate bias in AI systems, such as facial recognition software misgendering women and misclassifying people of color.
ANECDOTE

Gender Shades Study

  • Buolamwini's Gender Shades study revealed significant racial and gender bias in facial recognition software from major tech companies.
  • The findings prompted public pressure, and several companies subsequently improved their algorithms; a sketch of how such subgroup disparities can be measured appears below.
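To make the idea of an intersectional accuracy audit concrete, here is a minimal sketch of how error rates might be compared across subgroups. This is not the Gender Shades code: the `samples` format and the `predict` function are hypothetical placeholders standing in for a benchmark dataset and a commercial gender classifier.

```python
# Illustrative sketch (not the Gender Shades methodology itself):
# compare a classifier's error rates across intersectional subgroups.
from collections import defaultdict

def subgroup_error_rates(samples, predict):
    """samples: iterable of (image, true_gender, skin_type) tuples (hypothetical format).
    predict: function mapping an image to a predicted gender label.
    Returns the error rate for each (skin_type, true_gender) subgroup."""
    errors = defaultdict(int)
    totals = defaultdict(int)
    for image, true_gender, skin_type in samples:
        group = (skin_type, true_gender)
        totals[group] += 1
        if predict(image) != true_gender:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

# Example usage (benchmark and classifier are assumed, not real APIs):
# rates = subgroup_error_rates(benchmark, classifier.predict)
# gap = max(rates.values()) - min(rates.values())
# print(f"Largest gap in error rate between subgroups: {gap:.1%}")
```

The point of the sketch is that a single aggregate accuracy number can hide large gaps: reporting error rates per subgroup, and the spread between the best- and worst-served groups, is what surfaces the kind of disparity the study documented.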