The risk is not the individual weapon, but what happens when you scale it up. Someone will get mad or stupid enough to launch attacks with hundreds of thousands and millions of weapons that could wipe out whole cities. You're talking about something that has the destructive power of a hydrogen bomb, but is a heck of a lot cheaper. We'll be able to go to arms fairs and the black markets and all the rest - just like we can buy Kalashnikovs by the 100,000... And then they will be used.
The Sunday Times’ tech correspondent Danny Fortson brings on Stuart Russell, professor at UC Berkeley and one of the world’s leading experts on artificial intelligence (AI), to talk about working in the field for decades (4:00), AI’s Sputnik moment (7:45), why these programmes aren’t very good at learning (13:00), trying to inoculate ourselves against the idea that software is sentient (15:00), why superintelligence will require more breakthroughs (17:20), autonomous weapons (26:15), getting politicians to regulate AI in warfare (30:30), building systems to control intelligent machines (36:20), the self-driving car example (39:45), how he figured out how to beat AlphaGo (43:45), the paper clip example (49:50), and the first AI programme he wrote as a 13-year-old (55:45).