Artificial general intelligence (AGI) is a hypothetical future type of AI that could outcompete humans across many domains, such as long-term planning and social persuasion. Some researchers believe AGI will eventually pose an extinction risk but don't want to stop working on AI for now. These researchers might still support slowing down or restricting AI development once AGI is closer than it is today.
A reading of "Given Extinction Worries, Why Don’t AI Researchers Quit? Well, Several Reasons" by Daniel Eth.
ABOUT THE AI BREAKDOWN
The AI Breakdown helps you understand the most important news and discussions in AI.
Subscribe to The AI Breakdown newsletter: https://theaibreakdown.beehiiv.com/subscribe
Subscribe to The AI Breakdown on YouTube: https://www.youtube.com/@TheAIBreakdown
Join the community: bit.ly/aibreakdown
Learn more: http://breakdown.network/
Twitter: https://twitter.com/nlw / https://twitter.com/AIBreakdownPod