Ep. 7: How Humans Bias AI - Narrative Science Chief Scientist Kris Hammond
Jan 22, 2017
Kris Hammond, Chief Scientist at Narrative Science, discusses biases in AI systems, highlighting how human input influences decision-making. The podcast explores bias in news consumption, the idea of integrating antonyms into search results to encourage critical thinking, the unintentional reinforcement of biases in AI, and the creation of empathetic AI assistants for personalized experiences.
AI reflects human biases from data training, leading to biased outcomes.
Human interactions with AI can introduce biases, emphasizing the need for human oversight in machine learning processes.
Deep dives
Bias in Smart Machines
Smart machines are often seen as unbiased and objective, operating on data in a logical and precise manner. In practice, bias creeps into these systems through the data they are trained on, which reflects the biases of the people who collect and select that data. One example discussed is a facial recognition system whose training team skewed it toward their own idea of beauty, reflecting their limited perspective.
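As a rough illustration of that point, the toy sketch below (the numbers and group labels are made up, not data from the episode) shows how a model that falls back on the priors of its training set simply echoes whatever skew the data collectors introduced:

```python
# Toy illustration (hypothetical data): a model is only as balanced as the
# examples its builders chose to collect.
from collections import Counter

# Suppose the team collecting face examples sampled mostly from people who
# look like themselves (group "A") and very few from anyone else ("B").
training_labels = ["A"] * 95 + ["B"] * 5  # deliberately skewed, assumed data

counts = Counter(training_labels)
prior = {group: n / len(training_labels) for group, n in counts.items()}
print(prior)  # {'A': 0.95, 'B': 0.05}

# A classifier that falls back on these priors when a new face is ambiguous
# will systematically favour group "A" -- not because the math is wrong,
# but because the humans who assembled the data encoded their own perspective.
def predict_when_uncertain(prior):
    return max(prior, key=prior.get)

print(predict_when_uncertain(prior))  # 'A'
```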
Bias Through Interaction: The Case of Microsoft's Tay
Bias can also be introduced through interaction with learning systems such as Microsoft's Tay, which learned objectionable content because users deliberately fed it derogatory comments. This highlights how intentional interactions shape machine behavior and underscores the importance of human oversight in correcting and guiding machine learning processes.
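A minimal sketch of that failure mode (not Microsoft's actual architecture; the function names and the filter are hypothetical) is a system that adds user messages directly to what it learns, where human-supervised filtering is the only thing standing between coordinated abuse and the model:

```python
# Hypothetical sketch: a chatbot that learns phrases directly from users
# picks up whatever users feed it, unless a human-supervised filter sits
# between the conversation and the model.
learned_phrases = []

def flagged_for_review(text):
    # Stand-in for human moderation; the real safeguard is an ongoing
    # human-in-the-loop review process, not a fixed word list.
    return any(bad in text.lower() for bad in ("slur", "insult"))

def learn_from_user(message, with_oversight=True):
    if with_oversight and flagged_for_review(message):
        return  # routed to a human reviewer instead of being learned
    learned_phrases.append(message)

learn_from_user("hello there")
learn_from_user("<coordinated insult campaign>", with_oversight=False)
print(learned_phrases)  # without oversight, the abuse becomes training data
```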
Information Bias and Reinforcement
Information bias from similarity algorithms and the reinforcement of stereotypes in job recommendations show how technology reflects and amplifies human biases. The same dynamic drives online echo chambers: confirmation bias is reinforced when people are served only content similar to what they already believe. Countering it calls for technology that deliberately presents diverse viewpoints rather than narrowing perspectives.
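To make the mechanism concrete, here is a hedged sketch (the item names, vectors, and "contrasting slot" heuristic are invented for illustration) of how a similarity-only recommender produces an echo chamber, and how reserving a slot for a dissimilar item pushes back against it:

```python
# Similarity-only ranking keeps serving items close to what the reader has
# already consumed; a small diversity rule surfaces a contrasting viewpoint.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

catalog = {                      # made-up content vectors
    "article_like_A1": (1.0, 0.1),
    "article_like_A2": (0.9, 0.2),
    "opposing_view_B": (0.1, 1.0),
}
user_history = (1.0, 0.0)        # what the reader already clicks on

ranked = sorted(catalog, key=lambda k: cosine(catalog[k], user_history),
                reverse=True)
print("similarity only:", ranked[:2])   # an echo chamber of A-like items

# Counter-measure: reserve one slot for the *least* similar item so the
# reader also sees a contrasting viewpoint.
diverse = ranked[:1] + ranked[-1:]
print("with a contrasting slot:", diverse)
```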
It’s easy to think of AI as cold, unbiased, and objective. Not quite, suggests Narrative Science Chief Scientist Kris Hammond, because we never know when AI will repeat our biases back to us.