Neural Network Pruning and Training with Jonathan Frankle at MosaicML

Gradient Dissent: Conversations on AI

The Future of ChatGPT

Right now, we're watching ChatGPT really struggle with long context lengths when someone goes back and forth with it for enough iterations. We still don't even know how to solve that basic problem. I'm pretty skeptical that just taking the same things and making them bigger will solve any of these problems. Those are basic problems we're going to have to overcome before we get there.
