
Replit AI Podcast

03: The Next Generation of LLMs with Jonathan Frankle of MosaicML

Jun 29, 2023
59:32

Podcast summary created with Snipd AI

Quick takeaways

  • The development of large-scale open source language models like the 30B model showcases significant advancements in model size and capabilities.
  • Larger context sizes in language models offer enhanced power and capabilities while also posing challenges in training costs and model evaluation.

Deep dives

Exciting Advances in Open Source LLM Models

The discussion highlights Jonathan Frankle's new role as Chief Neural Network Scientist at Databricks, a milestone for the AI industry. The focus is on the development of large-scale open source language models, such as the 30B model, which demonstrates significant advances in model size and capability. The conversation covers the benefits and challenges of training such large models, emphasizing the potential for improved performance and the importance of architectural enhancements for future LLM development.
