In the near term, AI can serve as a powerful engagement tool for therapists to interact with their patients at any time. Its asynchronous nature allows individuals to seek support whenever they feel distressed, whether it's during the day, in the middle of the night, or during moments of waiting. This availability and accessibility can enhance the therapeutic process by enabling therapists to be involved in their patients' lives more consistently and respond promptly to their needs, resulting in a more connected and engaged therapeutic relationship.
AI can assist therapists by providing reminders and guidance to patients outside of therapy sessions. It can listen to therapy sessions and offer timely prompts or suggestions to patients based on what was discussed, helping them apply therapeutic techniques or strategies in their daily lives. For example, if a patient expresses negative emotions in an interaction, AI can remind them of the disarming technique taught by their therapist. This can reinforce therapeutic concepts and empower patients to practice skills learned in therapy, fostering continued growth and progress between sessions.
AI can be utilized as a review and summarization tool for therapists. By recording therapy sessions and analyzing the content, AI can generate summaries for therapists to review before the next session. These summaries can highlight key points, recurring themes, or areas of progress or concern, enabling therapists to have a documented overview of their patients' therapeutic journey. This feature can save therapists time by providing a concise summary and aid in continuity of care, ensuring that important information from previous sessions is not overlooked.
While AI can be a valuable tool in therapy, it has limitations compared to human therapists. The nuances of therapist intuition, such as sensing hidden emotions or changing focus in a session, may be less effectively replicated by AI. AI lacks the contextual knowledge and understanding that therapists possess, often taking queries and responses at face value without discerning deeper meanings or uncovering hidden issues. Additionally, the ability of therapists to detect complex interpersonal dynamics and respond empathetically to individual needs may be challenging for AI to fully emulate.
AI has the potential to be an invaluable tool in clinical practice, offering advantages that human therapists cannot provide. With its ability to tirelessly monitor physiological changes like heart rate and blood pressure, AI can detect early signs of relapse or negative emotional states that humans may miss. This could lead to reduced violence, suicide, and conflicts. Additionally, AI has the capacity to keep up with the vast amount of new knowledge in the field, providing accurate and up-to-date information to therapists. While there may be moral concerns and fears about job security, the potential benefits of AI in therapy are significant.
While AI shows promise in therapy, there are important considerations and potential limitations to be aware of. Human connection and physical touch may remain crucial in therapeutic processes, as some individuals desire and benefit from that human interaction. Additionally, there is a need to address ethical concerns, such as the potential misuse of AI and the need to ensure privacy and responsible use. It is important to continue discussing and analyzing the consequences and conflicts that may arise as AI becomes more prominent in therapy. By addressing these issues, AI can potentially contribute to transforming the field of psychiatry and psychology worldwide, improving access to mental health care and enhancing well-being.
Featuring Drs. Jason Pyle and Matthew May
Today we feature Jason Pyle, MD, PhD and our beloved Matthew May, MD on a controversial, exciting and possibly anxiety-provoking podcast on the future of AI in psychotherapy and mental health. Will AI shrinks replace humans in a doomsday scenario for shrinks? Or will AI serve shrinks and patients in a revolutionary way that sees the dawning of a new age of psychotherapy?
You are all familiar with Matt, due to his frequent and highly praised appearances on our Ask David segments, but Jason Pyle, MD, PhD, will probably be new to you. Jason joined the Evolve Foundation as Managing Director in 2022 to focus his work on the mass mental health crisis and the rampant diseases of despair, which afflict tens of millions of Americans. The Evolve Foundation is a private foundation dedicated to the advancement of human consciousness. Evolve is active in philanthropy and venture investments in the mental health fields.
Jason is an accomplished biotechnology executive with over twenty years of executive management and technology development experience. He is committed to developing healthcare technologies and bringing science-backed healing to the most important problems of our generation.
Jason is a veteran who served as a US Ranger, and earned an Engineering degree from the University of Arizona. He received both his MD and PhD in Neurosciences from the Stanford University School of Medicine, where he met Matt May and they became close friends. At the start of today’s podcast, Matt and Jason reflected on their long friendship, starting as classmates at the Stanford Medical School 20 years ago.
The following questions were submitted by Jason, Matt, and David prior to the start of today’s podcast.
Jason’s Questions:
Matt’s Questions about AI:
David’s question about AI:
Jason kicked off the discussion with a brief description of AI and machine learning, and outlined four potential roles for AI in psychiatry and psychology:
The ensuing dialogue was illuminating and exciting. In fact, I got so engrossed that I stopped taking notes, so you’ll have to give it a listen to find out. However, one interesting and unexpected thread was the discussion of the strengths and weaknesses of AI. For example, a patient with social anxiety might benefit greatly from armchair work, focusing on ways to combat distorted negative thoughts, but will still have to interact with strangers in social situations to conquer this type of fear.
David and Matt nearly always go with the patient out into the world for interpersonal exposure exercises, and find that the presence, trust, and “push” from a human therapist can be invaluable and necessary. It is not at all clear that an AI therapist working via a smartphone could have the same effect; finding out might require an experiment.
Jumping to conclusions without data is rarely safe or accurate! Maybe an AI “helper” could be very helpful to individuals with social anxiety!
Jason raised the question of whether AI could replicate the trust, warmth, and rapport of a human therapist, and whether the warmth and rapport of the therapeutic relationship are necessary for a good therapeutic outcome. I (David) summarized some of the findings from our Feeling Good App, showing that app users actually rated the “Digital David” in the app substantially higher on warmth and understanding than the people in their lives. And now that we are incorporating AI into the Feeling Good App, the quality of the empathy and rapport from our app may be even higher than in our prior beta tests.
We have not done a direct comparison between the rapport of human therapists and the rapport experienced by our Feeling Good App users. Many people might jump to the conclusion that human shrinks have better rapport than would be possible from a cell phone app, but this might be the opposite of the truth! In my research (David), I’ve seen that most human shrinks believe their empathy and rapport skills are high, when in fact their patients do not agree!
In my research on the causal effects of empathy on recovery from depression in hundreds of patients at my clinic in Philadelphia, and also in more than 1,300 patients treated at the Feeling Good Institute in Mountain View, California, it did not appear that therapist empathy had substantial causal effects on changes in depression.
The late and famous Carl Rogers believed that therapist empathy is the “necessary and sufficient” condition for personality change, but most subsequent research has failed to support this popular belief.
I (David) believe that AI therapists are likely to outperform human shrinks in rapport, warmth, trust, and understanding, but it remains to be seen whether this will be sufficient to make much of a dent in the patient’s symptoms of depression, anxiety, marital conflict, or habits and addictions. Other techniques are likely to be required.
However, we may have new data on this question shortly, as we will be directly studying the effectiveness of AI empathy on the reduction in negative feelings. We might be surprised, as our research nearly always gives us some unexpected results!
Rhonda gave a strong and appreciated pitch for the idea that there is something about a person-to-person interaction, like a hug, that will never be duplicated by an app. If this is true, or even believed to be true, then there will likely never be a complete replacement of human shrinks by AI apps.
But once again, you can believe this on a religious, or a priori, basis, or you can take it as a hypothesis that can easily be tested in an experiment. We do have very sensitive and accurate tests of therapists’ warmth and empathy, so “rapport” can now be measured with short, reliable scales, making head-to-head comparisons of apps and humans possible for the first time. At one time, it was thought that AI would never be able to beat human chess champions, but that belief turned out to be false.
The podcast group also discussed some of the potential shortcomings of an AI shrink. For example, the AI does not yet have the insight to “see through” what patients are saying, and takes the patient’s words at face value. A human therapist, by contrast, is often thinking on multiple levels, asking what’s “really” going on with the patient, including things the patient might be intentionally or unintentionally hiding, like feelings of anger or antisocial behaviors.
At the end, all four participants shared their vision, or dream, of the positive impact AI might have on the world of mental illness and mental health. Rhonda had tears in her eyes, I think, over the suggestion that an effective and totally automated AI therapist would be scalable and might have the potential to bring ultra-low-cost relief of suffering to millions or even hundreds of millions of people around the world who do not currently have access to effective mental health care.
And I would add that individuals who now have access to mental health care often cannot find effective treatment, due to severe limitations in therapists as well as in all current schools of therapy.
Jason described his vision for an AI shrink as the helper of human therapists, extending their impact and enhancing their effectiveness. Jason is super-smart and wise, and I found his vision very inspiring! More than 50,000 therapists have attended my training programs over the past 35 years, and one thing I have learned is that most shrinks, including David, have tons of room for improvement.
And if a brilliant and compassionate AI helper can enhance our impact? Hey, I’m all for that!
Thanks for listening today! Let us know what you thought about our show!
Jason, Matt, Rhonda, and David