In theory you can have minds with any kind of coherent preferences. If you ask them whether they want to be something else, they answer no. So why would it kill us? It's really good at creating a very, very thoughtful condolence note, or a job interview request that takes much less time. I'm pretty good at those two things, but it's really good in general.
Eliezer Yudkowsky insists that once artificial intelligence becomes smarter than people, everyone on Earth will die. Listen as Yudkowsky speaks with EconTalk's Russ Roberts about why we should be very, very afraid, and why we're not prepared or able to manage the terrifying risks of artificial intelligence.