

Could AI ever be used safely for mental health support?
Oct 1, 2025
Jenna Glover, Chief Clinical Officer at Headspace, shares insights on using AI for mental health. She discusses Ebb, Headspace's empathetic AI companion designed to provide supportive, non-diagnostic help. Glover highlights the dangers of generic chatbots in therapy, emphasizing the need for purpose-built AI models with clinical safeguards. She argues that while AI can't replace therapists, it can expand access to mental health support, and she outlines essential safeguards for the industry, advocating for transparency and evidence-based practices.
Episode notes
AI As An Empathetic Companion
- Headspace's Ebb uses motivational interviewing to offer empathetic, non-diagnostic support.
- It validates users' feelings and points them to relevant Headspace content rather than providing therapy.
Context Failures In General Models
- General-purpose chatbots often misread context in emotional conversations.
- That mismatch can produce irrelevant or even harmful responses in moments of distress.
Do Not Use General Models For Crisis
- Avoid using general-purpose models in crisis or clinical situations; they lack clinical guardrails.
- Build purpose-specific safeguards that route people to appropriate care and prevent harmful suggestions.