3min chapter

#104 - Prof. CHRIS SUMMERFIELD - Natural General Intelligence [SPECIAL EDITION]

Machine Learning Street Talk (MLST)

CHAPTER

Is in-Context Learning Really Powerful?

ChatGPT is the first example of an AI which apparently can recognize abstract concepts like Christmas. It's trained with reinforcement learning from human feedback, right? So, it's trained from human preferences as well. We don't know what the limits of this approach are, and maybe there are many, many things that you can do with this. My point is that we need to think more about what the problem is, about what we want AI to do. And one of the points that I try to make in the book is that we need better answers to those questions.
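As context for "trained from human preferences": the core of RLHF reward modelling is a pairwise (Bradley-Terry) loss over responses a human ranked. The sketch below is a minimal illustration under that assumption, not ChatGPT's actual training code; the toy model, dimensions, and data are hypothetical.

```python
# Minimal sketch of learning a reward model from human preference pairs (RLHF step).
import torch
import torch.nn as nn

class TinyRewardModel(nn.Module):
    """Toy reward model: maps a response embedding to a scalar score."""
    def __init__(self, dim: int = 16):
        super().__init__()
        self.score = nn.Linear(dim, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.score(x).squeeze(-1)

def preference_loss(reward_chosen: torch.Tensor, reward_rejected: torch.Tensor) -> torch.Tensor:
    # Maximise the log-probability that the human-preferred response scores higher.
    return -torch.nn.functional.logsigmoid(reward_chosen - reward_rejected).mean()

# Hypothetical data: embeddings of preferred vs. rejected responses.
model = TinyRewardModel()
chosen, rejected = torch.randn(8, 16), torch.randn(8, 16)
loss = preference_loss(model(chosen), model(rejected))
loss.backward()  # in full RLHF, the learned reward then guides a policy via RL (e.g. PPO)
```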
