Eliezer Yudkowsky is the founder of the Machine Intelligence Research Institute and an outspoken voice on the dangers of artificial general intelligence. Eliezer: You can't in principle survive creating something much smarter than you. It would require precision and preparation, and probably not having AI systems composed of giant inscrutable arrays of fractional numbers.
Eliezer Yudkowsky insists that once artificial intelligence becomes smarter than people, everyone on earth will die. Listen as Yudkowsky speaks with EconTalk's Russ Roberts about why we should be very, very afraid, and why we're neither prepared for nor able to manage the terrifying risks of artificial intelligence.