Generative AI has developed so quickly over the past two years that massive breakthroughs seemed more a question of "when" than "if." But in recent weeks, Silicon Valley has grown increasingly concerned that advancements are slowing. One early indication is the lack of progress between models released by the biggest players in the space. OpenAI is reportedly seeing a significantly smaller quality gain in its next model, GPT-5; Anthropic has delayed the release of its most powerful model, Opus, judging by wording quietly removed from its website; and even at Google, the upcoming version of Gemini is reportedly not living up to internal expectations.

If progress is plateauing, it would call into question a core assumption that Silicon Valley has treated as religion: scaling laws, the idea that adding more computing power and more data guarantees better models, seemingly without limit. Recent developments suggest scaling laws may be more theory than law. The key problem could be that AI companies are running out of data to train models on, hitting what experts call the "data wall." Instead, they are turning to synthetic data, meaning data generated by AI itself.

CNBC's Deirdre Bosa explores whether AI progress is slowing, and what it means for the industry.