
Simeon Campos on Short Timelines, AI Governance and AI Alignment Field Building

The Inside View


Coping Governance: A Distraction From Alignment Concerns

Google is asking Google Brain to have some interpretability measures for any model that requires more than 10 to the 15 FLOPs or something. I think it's still important when you don't buy alignment, but the point I was making is more that top organizations potentially have the ability to implement norms, and these norms can be adopted by other actors as well. If some orgs started saying, "for state-of-the-art models we always run interpretability checks and red teaming before deploying them," that could be followed by other orgs.
