I think there is a level of general intelligence that becomes effectively universal. Once you have a sufficient level of general intelligence, you should be able to design new cognitive modules. So if you were an engineering superintelligence and you lacked the ability to understand poetry, you could use your engineering superintelligence to construct additional mental modules for understanding poetry as well. I think we can distinguish three different flavors of superintelligence. The easiest one, conceptually, is to take a human mind like mine and just imagine that it operated a million times faster. Then you would have a kind of superintelligence, in that this human mind could achieve things that a human could not do within a given interval of time.
Nick Bostrom of the University of Oxford talks with EconTalk host Russ Roberts about his book, Superintelligence: Paths, Dangers, Strategies. Bostrom argues that when machines exist that dwarf human intelligence, they will threaten human existence unless steps are taken now to reduce the risk. The conversation covers the likelihood of the worst scenarios, strategies that might be used to reduce the risk, and the implications for labor markets and human flourishing in a world of superintelligent machines.