I'm going to call it a black box, which means we don't understand what's happening inside. That's the gradient descent. It's heading in the direction of a smaller loss and a better prediction. And let me just say one other thing, which I haven't said enough in my preliminary conversations on this topic. And that is: this is one of the greatest achievements of humanity that we could possibly imagine. So it's going to be very hard to give it up.
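The "heading in the direction of a smaller loss" idea above can be sketched in a few lines. This is a minimal illustration, assuming a one-parameter linear model and a squared-error loss; all names and values here are illustrative, not from the conversation.

```python
def loss(w, x, y_true):
    """Squared error of the prediction y = w * x."""
    return (w * x - y_true) ** 2

def grad(w, x, y_true):
    """Derivative of the loss with respect to the weight w."""
    return 2 * x * (w * x - y_true)

def descend(w, x, y_true, lr=0.1, steps=100):
    """Repeatedly nudge w in the direction that shrinks the loss."""
    for _ in range(steps):
        w -= lr * grad(w, x, y_true)
    return w

# Starting from w = 0, the weight converges toward the value (3.0)
# that makes the prediction match the target.
w_final = descend(w=0.0, x=1.0, y_true=3.0)
```

Each step moves the weight opposite to the loss gradient, which is exactly the "smaller loss, better prediction" direction described above; the inner workings of a large network remain opaque even though this update rule is simple.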
Eliezer Yudkowsky insists that once artificial intelligence becomes smarter than people, everyone on earth will die. Listen as Yudkowsky speaks with EconTalk's Russ Roberts on why we should be very, very afraid, and why we're not prepared or able to manage the terrifying risks of artificial intelligence.