

OpenAI's GPT-5 In 2025?, Big Tech’s Big AI Spend, The Polymarket Election
Nov 1, 2024
Ranjan Roy, a tech news contributor at Margins, dives into the latest buzz that OpenAI's GPT-5 may be postponed to 2025. They discuss OpenAI's shift in focus from big foundational models to enhancing product experiences. Roy highlights Meta's ambitious AI projects using over 100,000 GPUs and the intensifying competition in the search market. The conversation also touches on the paradox of big tech investing heavily in AI while laying off employees, and concludes with a light-hearted yet insightful look at the upcoming election and its intersection with tech trends.
AI Snips
GPT-5 Delay
- OpenAI might delay GPT-5 due to escalating competition and the need for a larger GPU cluster.
- Competitors like xAI and Meta are training models on clusters of 100,000+ H100 GPUs, potentially surpassing OpenAI's compute capacity.
Product Focus
- OpenAI may prioritize product development over foundational model advancements.
- Focusing on existing models like GPT-4 and enhancing products like ChatGPT could yield better financial outcomes.
Sora's Delay
- The substantial compute resources needed for video generation may push Sora's public release to 2025.
- Generating high-quality video content is computationally expensive and requires safety considerations.