
#214 – Buck Shlegeris on controlling AI that wants to take over – so we can use it anyway

80,000 Hours Podcast

Advocating for AI Safety and Alignment

This chapter explores the role of dedicated AI safety and alignment teams within companies. It emphasizes how even a small group of committed people can meaningfully shift how responsibly AI is developed, and discusses the challenges and opportunities in implementing safety practices and fostering collaboration across the industry.
