Ep 2: Databricks CTO Matei Zaharia on scaling and orchestrating large language models

Unsupervised Learning

Is Scaling the Future of LLMs?

The transformer architecture is kind of becoming the de facto architecture that's used across a bunch of industries. But then it's about how you can actually scale these, and you're a systems person, so you're the expert on scale. So I think a lot of people are going down this road and saying, hey, I can get X million dollars in investment.
