Episode #184 ... Is Artificial Intelligence really an existential risk?

Philosophize This!

The Implications of Technology: AGI, Nuclear Weapons, and Bioengineering

- The widespread use of new technology is often seen as an experiment our society is running in real time.
- Negative effects of new technology should be addressed by the government through regulations to protect the public.
- This strategy works well for low-stakes technologies like vacuum pumps or home appliances, but not for high-stakes technologies like AGI or nuclear technology.
- Technology is not neutral; it always carries both affordances (allowing new capabilities) and limitations (taking away previous capabilities).
- It is important to be skeptical of technologies that claim to be neutral but have power over us.
- AGI (Artificial General Intelligence) is a future possibility that some people are concerned about, while others dismiss it as a distant and speculative concern.
- Those advocating for AGI risk prevention argue that, given the stakes of this technology, waiting for uncertainties to be resolved is not a luxury we can afford.
