
20VC: AI Scaling Myths: More Compute is not the Answer | The Core Bottlenecks in AI Today: Data, Algorithms and Compute | The Future of Models: Open vs Closed, Small vs Large with Arvind Narayanan, Professor of Computer Science @ Princeton

The Twenty Minute VC (20VC): Venture Capital | Startup Funding | The Pitch

NOTE

Innovation Beyond Size

Historically, improvements in model performance have come from increasing model size and training data, but that trend is slowing as growth in model parameters runs into limits. The jump from GPT-3.5 to GPT-4 shows how much scale has mattered, yet future gains may be smaller because data has become a bottleneck: the models have largely exhausted the available data pool. Additional compute still helps, but it is increasingly possible to build smaller models that match the performance of their larger predecessors. This points toward efficiency rather than sheer size, and it raises skepticism that a future model such as GPT-5 will deliver a leap in capability comparable to earlier generational jumps.

