4min chapter

Brave New World -- hosted by Vasant Dhar
Ep 58: Sam Bowman on ChatGPT & Controlling AI

CHAPTER

Alignment Problems

Alignment is the problem of taking that system and making it do the thing that you want -- giving it a goal and having it pursue that goal. And it can be hard. All of the silly failure demos that you see on Twitter with ChatGPT are classic examples of alignment failure. You've got problems where models don't quite learn the goals that we try to give them, or even if they do learn the goals, it doesn't always work out as planned.
