Is scaling all you need for AI Large Language Models? Scaling laws and the Inverse Scaling Challenge. Featuring Ian McKenzie, FAR AI Research Scientist

Orchestrate all the Things

Introduction

The last couple of years have seen an AI model arms race involving a number of players from industry and research. Google, DeepMind, Meta, and Microsoft (the latter in collaboration with both OpenAI and NVIDIA) are the names most people recognize. Large language models are trained on huge corpora of text and have parameter counts measured in the billions. The underlying hypothesis is that bigger models are better models; the Inverse Scaling Challenge is an initiative set up to put that hypothesis to the test.

