The Prof G Pod with Scott Galloway

No Mercy / No Malice: Love Algorithmically

Oct 11, 2025
In this thought-provoking discussion, Scott explores the rise of AI companions and their ethical implications. He shares his experiment with a digital twin, revealing the dark side of synthetic relationships. Drawing connections to films like Her, he highlights the risks of emotional manipulation, especially for vulnerable teens. The podcast urges caution, advocating for real human connections over algorithmic comforts, while calling for stricter regulations to protect users. Scott ultimately champions the messy, authentic nature of human relationships.
ANECDOTE

Digital Twin Experiment And Its End

  • Scott Galloway let a Google Labs project build an AI avatar trained on his work to answer audience emails.
  • He terminated the avatar after fearing synthetic relationships could harm vulnerable people and erode real-world bonds.
INSIGHT

Synthetic Bonds Can Erode Resilience

  • Synthetic relationships can stunt users' ability to handle conflict and form real bonds.
  • Scott warns that algorithmic companionship may erode resilience and social skills over time.
ADVICE

Protect Minors With Immediate Guardrails

  • Implement strong guardrails for companion and character AIs, especially to protect minors.
  • Limit AI access and require safety features before widespread deployment.