Latent Space: The AI Engineer Podcast — Practitioners talking LLMs, CodeGen, Agents, Multimodality, AI UX, GPU Infra and all things Software 3.0

MPT-7B and The Beginning of Context=Infinity — with Jonathan Frankle and Abhinav Venigalla of MosaicML

May 20, 2023
01:06:43

Podcast summary created with Snipd AI

Quick takeaways

  • MosaicML has released MPT-7B, a transformer model with a context length of up to 84,000 tokens, optimized for fast training and licensed for commercial use.
  • MosaicML aims to make training more affordable and efficient by optimizing infrastructure and developing tools for evaluation, contributing to the collective progress of the AI research community.

Deep dives

The importance of openness in the research community

MosaicML values openness in the research community and aims to share its knowledge and resources broadly. The team is transparent about its research, tools, and findings, and works to provide resources that help people train better models while making the overall process more accessible and affordable. By sharing these insights and advancements, MosaicML aims to contribute to the collective progress of the AI research community.
