

#411 Dr Ross Harper from Limbic: Is It Unethical Not to Use AI in Mental Health Care? (Part 1)
Aug 20, 2025
Dr. Ross Harper, CEO and co-founder of Limbic, is pioneering clinical-grade AI tools to transform mental health care. He discusses the impact of AI technology on diagnostics and treatment, especially since the pandemic. The conversation dives into the accessibility challenges within the UK mental health system, highlighting how AI can reduce waiting times and address the needs of underserved demographics. Harper emphasizes the importance of integrating technology while maintaining a human touch, making a compelling case for responsible AI deployment in mental health.
Clinical AI Requires Medical-Grade Evidence
- Clinical AI must meet a high bar of evidence, regulation, and integration to be trusted in healthcare.
- Limbic is a regulated medical device with peer-reviewed evidence, unlike wellness chatbots.
Integrate AI Into Clinical Workflows
- Design AI tools to integrate into clinical workflows and records rather than as standalone wellness apps.
- Integration enables measurable service-level efficiencies and trust from clinicians and health systems.
Ethical Case For Scaling With AI
- Hundreds of millions of people with diagnosable mental health conditions go without care.
- Ross argues it's an ethical imperative to use AI and automation to scale limited clinician capacity.