

Claudia Larcher: AI and the Art of Historical Reinterpretation
Apr 9, 2025
Claudia Larcher, an artist who explores AI bias and representation across diverse mediums, discusses how gender discrimination is embedded in AI models. She shares her approach of injecting inclusivity into AI training data to transform historical narratives. The conversation covers the difficulty of generating diverse images when stereotypes are entrenched in the data, and emphasizes the need for critical engagement with technology. Larcher also reflects on the artist's role in reshaping history while confronting societal biases, and offers a look at her future projects.
AI Snips
Bias Perpetuation in AI
- AI models trained on biased data perpetuate harmful stereotypes, such as over-sexualized images of women.
- This reinforces existing societal biases and narrows the range of diverse, inclusive futures that can be imagined.
AI’s Difficulty Generating Women
- Claudia Larcher found it difficult to generate diverse female faces with AI because of biases in the training data.
- Trigger words such as "female," "Asian," or items of clothing led to censorship, hinting at an over-representation of pornographic content in the training data.
Mitigation Filters' Ineffectiveness
- Claudia Larcher's art project was inspired by a workshop where she realized that AI bias persisted despite mitigation efforts.
- Such mitigation filters mask the problem without addressing its root cause, hindering true inclusivity.