
Episode #184: Is Artificial Intelligence really an existential risk?

Philosophize This!

NOTE

The Implications of Technology: AGI, Nuclear Weapons, and Bioengineering

- The widespread use of new technology is often seen as an experiment our society is running in real time.
- Negative effects of new technology should be addressed by the government through regulation to protect the public.
- This strategy works well for low-stakes technologies like vacuum pumps or home appliances, but not for high-stakes technologies like AGI or nuclear technology.
- Technology is not neutral: it always carries both affordances (enabling new capabilities) and limitations (taking away previous capabilities).
- It is important to be skeptical of technologies that claim to be neutral yet hold power over us.
- AGI (Artificial General Intelligence) is a future possibility that some people are concerned about, while others dismiss it as a distant and speculative concern.
- Those advocating for AGI risk prevention argue that, given the stakes of this technology, waiting for uncertainties to be resolved is not a luxury we can afford.

