2min snip

Reinventing AI Agents with Imbue CEO Kanjun Qiu

Gradient Dissent: Conversations on AI

NOTE

Cultivating Community Drives Innovation

Open-sourcing the tooling around model training, rather than only the models themselves, fosters trust and scalability in research. Smaller-scale experiments let researchers validate outcomes before scaling up, decentralizing the knowledge and capability needed to train large models. Collaborative efforts, such as Percy Liang's initiative at Stanford, aim to build a robust open-source community akin to the open-source software movement, which is essential for the ecosystem's growth. Sharing insights from the complexities of training at scale, including strategies for preventing instabilities, empowers other research groups and could lead to breakthroughs in model building that have not yet been achieved.
