Screaming in the Cloud

From Aurora to PlanetScale: Intercom’s Database Evolution with Brian Scanlan

Sep 18, 2025
Brian Scanlan, Senior Principal Engineer at Intercom, discusses the company's migration from AWS Aurora to PlanetScale, driven by scalability limits and the operational pain of managing 13 database clusters. He shares insights on Intercom's transition to AI chatbots after the launch of ChatGPT, emphasizing their role in augmenting human support. Beyond databases, Brian highlights the shrinking talent pipeline for systems engineers and Intercom's volunteer-based on-call model, which strengthens both operations and recruitment. A fascinating dive into tech evolution at Intercom!
AI Snips
ADVICE

Use AI To Augment Support

  • Use AI chatbots to augment humans, not fully replace them, and keep human fallback paths.
  • Feed bots better documentation and move support staff to higher-value work.
INSIGHT

Better Bots Can Increase Demand

  • Good chatbots can increase total conversation volume because faster, useful answers encourage more questions.
  • High-quality knowledge bases amplify chatbot effectiveness and user engagement.
INSIGHT

Monolith Scalability Tradeoffs

  • Intercom scaled a Ruby on Rails monolith on EC2 and relied heavily on Aurora for years.
  • Aurora's split of compute and storage, plus read replicas, bought years of headroom, but eventually led to complex sharding and migration pain.