Will Douglas Heaven, a senior editor at MIT Technology Review with a PhD in computer science, dives deep into the prevailing AGI hype. He differentiates between consumer AI and true AGI and explores why tech founders promote inflated promises. The conversation touches on AGI's rise to prominence, compares believers to conspiracy theorists, and critiques current benchmarks for AI intelligence. Will also emphasizes the importance of balanced journalism in the age of AI hype, encouraging a more skeptical approach to predictions about AGI's impact on society.
Generality, Not Glamour, Defines AGI
AGI is defined by generality: an AI that can perform the broad range of tasks a reasonably capable person can do.
Current tools excel at narrow tasks like image or video generation but fail when pushed outside those niches.
Big Promises Finance Big Infrastructure
The industry sells cosmic stakes to justify massive financial and environmental costs.
Grand promises let companies rationalize building hugely expensive infrastructure for uncertain returns.
AGI Rhetoric Follows Conspiracy Patterns
AGI talk mirrors conspiracy mechanics: flexible promises, salvation narratives, and moved timelines sustain belief.
That structure explains both utopian and apocalyptic rhetoric around AI.
Have you been having fun with the newest slate of AI tools? Have you been doing research with GPT-5? Coding your projects with Claude? Turning pictures of your friends into cartoon characters from The Fairly OddParents using the image-editing tool Nano Banana?
Are you impressed with what they can do? Well, guess what? You're only impressed with them because you're basically a naive child. You're like a little child with an Etch A Sketch who is amazed that they can make crude images by turning the knobs, oblivious to greater possibilities. At least, that's the impression you get when listening to tech leaders, philosophers, and even governments. According to them, soon the most impressive of AI tools will look as cheap and primitive as Netflix's recommendation algorithm in 2007. Soon the world will have to reckon with the power of Artificial General Intelligence, or "AGI."
What is AGI? Definitions vary. When will it come? Perhaps months. Perhaps years. Perhaps decades. But definitely soon enough for you to worry about. What will it mean for humanity once it's here? Perhaps a techno-utopia. Perhaps extinction. No one is sure. But what they are sure of is that AGI is definitely coming and it's definitely going to be a big deal. A mystical event. A turning point in history, after which nothing will ever be the same.
However, some are more skeptical, like our guest today, Will Douglas Heaven. Will has a PhD in computer science from Imperial College London and is the senior editor for AI at MIT Technology Review. He recently published an article, based on his conversations with AI researchers, which provocatively calls AGI "the most consequential conspiracy theory of our time."
Jake and Travis chat with Will about the conspiracy theory-like talk from the AI industry, whether AGI is just “vibes and snake oil,” and how to distinguish between tech breakthroughs and Silicon Valley hyperbole.
Will Douglas Heaven
https://bsky.app/profile/willdouglasheaven.bsky.social
How AGI became the most consequential conspiracy theory of our time
https://www.technologyreview.com/2025/10/30/1127057/agi-conspiracy-theory-artifcial-general-intelligence/
Subscribe for $5 a month to get all the premium episodes: https://www.patreon.com/qaa
Editing by Corey Klotz. Theme by Nick Sena. Additional music by Pontus Berghe. Theme Vocals by THEY/LIVE (https://instagram.com/theyylivve / https://sptfy.com/QrDm). Cover Art by Pedro Correa: (https://pedrocorrea.com)
https://qaapodcast.com
QAA was formerly known as the QAnon Anonymous podcast.
The first three episodes of Annie Kelly’s new 6-part podcast miniseries “Truly Tradly Deeply” are available to Cursed Media subscribers, with new episodes released weekly.
www.cursedmedia.net/
Cursed Media subscribers also get access to every episode of every QAA miniseries we produced, including Manclan by Julian Feeld and Annie Kelly, Trickle Down by Travis View, The Spectral Voyager by Jake Rockatansky and Brad Abrahams, and Perverts by Julian Feeld and Liv Agar. Plus, Cursed Media subscribers will get access to at least three new exclusive podcast miniseries every year.
www.cursedmedia.net/
REFERENCES
Debates on the nature of artificial general intelligence
https://www.science.org/doi/10.1126/science.ado7069
Why AI Is Harder Than We Think
https://arxiv.org/pdf/2104.12871
AI Capabilities May Be Overhyped on Bogus Benchmarks, Study Finds
https://gizmodo.com/ai-capabilities-may-be-overhyped-on-bogus-benchmarks-study-finds-2000682577
Examining the geographic concentration of VC investment in AI
https://ssti.org/blog/examining-geographic-concentration-vc-investment-ai
Margaret Mitchell: artificial general intelligence is ‘just vibes and snake oil’
https://www.ft.com/content/7089bff2-25fc-4a25-98bf-8828ab24f48e