Another Podcast

How does OpenAI compete?

Dec 20, 2025
The hosts dive into the fierce competition surrounding OpenAI and the commoditization of AI models. They examine the infrastructure challenges and the difficulty of maintaining cost/performance advantages. Discussion flows into strategies for product differentiation and the role of AI as a feature. The disconnect between advanced builders and everyday users is highlighted, along with the real-world utility of LLMs. They ponder strategic choices for the future and draw parallels between AI's current landscape and the early days of the web.
INSIGHT

Models Look Commoditized Today

  • Large LLMs are roughly commoditized: leadership swaps week to week, and few users can tell the difference.
  • This limits product defensibility, because the science, data, and infrastructure approaches are similar across players.
INSIGHT

Infra Alone Isn't A Clear Moat

  • Competing by building cheaper or faster infrastructure is possible, but it is costly and offers limited benefit that users can actually see.
  • Without a clear user-facing performance gap, infrastructure alone is an uncertain long-term strategy.
INSIGHT

Three Strategic Paths Up The Stack

  • Companies can add AI as a feature, build new AI-first experiences, or pursue infrastructure/API platform strategies.
  • Each path faces fierce competition and weak classic network effects compared with historical platform shifts.