
Nick Bostrom on Superintelligence
EconTalk
The Non-Trivial Probability of a Superintelligence Transition
Once you have a machine intelligence that reaches roughly human level, or maybe somewhat above human level, you might get a very rapid feedback loop. So if you have a fast transition from human-level machine intelligence to superintelligence, then it's likely that you will only have one superintelligence at first, before any other system is even roughly comparable. And then this first superintelligence might be very powerful. It could develop all kinds of new technologies very quickly, and strategize and plan. But for reasons that I go into in some depth in the book, it looks really hard to engineer a seed AI such that it will result in a superintelligence with human-friendly preferences. Maybe we can call it, and what…