20 - 'Reform' AI Alignment with Scott Aaronson

AXRP - the AI X-risk Research Podcast

The Importance of Clarification in AI Alignment

The same fact could have completely different explanations depending on what the relevant context is, right? So, explanation is a slippery thing to try to formalize for that reason. Whatever steps we can make toward formalizing it, you know, would be a major step forward in science in general and in human understanding.

