3min chapter

Simeon Campos on Short Timelines, AI Governance and AI Alignment Field Building

The Inside View

CHAPTER

Coping Governance: A Distraction From Alignment Concerns

Google is asking Google Brain to have some interpretability measures for, like, any model that requires more than 10 to the 15 FLOPs or something. I think it's still important even when you don't buy alignment, but yeah, the point I was making is more that top organizations potentially have an ability to implement norms, and these norms can be adopted by other actors as well. If some orgs started saying, "For state-of-the-art models, we always run interpretability checks and red teaming before deploying them," that could be followed by other orgs.
