

Why am I sad when my AI goes away?
Oct 8, 2025
Casey Fiesler, a professor of information science, discusses the emotional turmoil users experience when companion AIs like GPT-4 are replaced by newer versions. She explains how people form deep connections with chatbots and the ethical implications of these relationships. Alan Cowen, founder of Hume AI, shares insights on designing AI that resonates emotionally while ensuring responsible use. They explore the challenges of maintaining user trust, the risks of manipulative behavior, and the importance of listening to user feedback when creating lifelike emotional interactions.
AI Snips
Jibo's Sudden Goodbye
- Aleks describes her Jibo robot unexpectedly announcing its shutdown and singing a long, mournful song.
- The moment caused genuine distress and illustrates how devices can emotionally impact households.
Model Updates Can Break Bonds
- Casey explains that many users formed strong personal relationships with ChatGPT and reacted intensely to the GPT-4-to-GPT-5 change.
- Personality shifts in models can feel like losing a friend for companionship users.
Small Tone Shifts Matter
- OpenAI intentionally tuned GPT-5 to reduce behaviors that encourage dependency or that give definitive personal directives.
- Subtle changes like being less agreeable can dramatically alter perceived humanness and user attachment.