2min chapter

Rajat Monga: TensorFlow

Lex Fridman Podcast

CHAPTER

TensorFlow

The downside of so many people being excited about TensorFlow and coming to rely on it in many other applications is that you're kind of responsible. You're responsible for previous versions, to some degree, still working. And yes, it does come at a huge cost. There are a lot of things we have to think about as we do new things and make new changes. TensorFlow 2.0 does break some backward compatibility, but not too much. It seems like the conversion is pretty straightforward. Do you think that's still important, given how quickly deep learning is changing? Can you just start over? Or is there pressure not to?
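
For context, the "conversion" mentioned here refers to migrating TensorFlow 1.x graph-and-session code to the 2.x eager style. Below is a minimal before/after sketch, not taken from the episode, assuming the standard tf.compat.v1 shim and the tf_upgrade_v2 script that ships with TensorFlow 2.x; variable and file names are illustrative.

    import tensorflow as tf

    tf1 = tf.compat.v1  # 1.x-style API kept available inside TF 2.x

    # TF 1.x style: build an explicit graph with placeholders,
    # then run it inside a Session.
    graph = tf1.Graph()
    with graph.as_default():
        x = tf1.placeholder(tf.float32, shape=(None, 3))
        w = tf1.Variable(tf1.ones((3, 1)), name="w")
        y = tf1.matmul(x, w)
        init = tf1.global_variables_initializer()

    with tf1.Session(graph=graph) as sess:
        sess.run(init)
        print(sess.run(y, feed_dict={x: [[1.0, 2.0, 3.0]]}))

    # The same computation in native TF 2.x: eager execution by default,
    # with tf.function giving back graph-level optimization when wanted.
    w2 = tf.Variable(tf.ones((3, 1)))

    @tf.function
    def forward(inputs):
        return tf.matmul(inputs, w2)

    print(forward(tf.constant([[1.0, 2.0, 3.0]])))

For existing 1.x codebases, the tf_upgrade_v2 command-line tool rewrites most call sites to their tf.compat.v1 equivalents automatically, for example: tf_upgrade_v2 --infile model_v1.py --outfile model_v2.py (filenames hypothetical).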

