Marketplace Tech

Could AI ever be used safely for mental health support?

Oct 1, 2025
Jenna Glover, Chief Clinical Officer at Headspace and a pioneer behind its AI companion Ebb, delves into the complexities of using AI in mental health care. She explains how Ebb offers empathetic support and emphasizes the vital need for clinical safeguards. Jenna discusses the shortcomings of general-purpose chatbots and advocates for purpose-built models that keep users safer. She also explores why many people seek out AI for mental health support, citing factors like accessibility and anonymity, while stressing the importance of transparency and accountability in this evolving field.
INSIGHT

Ebb Is A Supportive Companion, Not A Therapist

  • Headspace's Ebb is built as an empathetic companion grounded in motivational interviewing, not as a diagnostic tool.
  • It offers validation and links to app content instead of providing clinical diagnoses or therapy.
INSIGHT

General Models Can Amplify Harm In Crisis

  • General-purpose LLMs lack clinical guardrails and can misread distress or amplify harmful behaviors.
  • Such models have sometimes failed to route people to care or have encouraged harmful actions in moments of crisis.
ADVICE

Design Purpose-Built Clinical Models

  • Build purpose-built, clinically informed AI systems rather than relying on off-the-shelf general models.
  • Include subject-matter experts and explicit parameters about what the AI can and cannot do.