I was always curious: what are your thoughts on AI in the context of potential disaster? Well, I think one has to steer clear of too science-fiction a scenario, namely that we arrive at artificial general intelligence. I mean, that's clearly a great science fiction plot, but it's not a very likely reality. After all, it's proving to be very difficult, indeed, to get cars to drive themselves. The idea that we're going to build a doomsday machine that has the capacity to wipe us out seems a bit unlikely.
Disasters are inherently hard to predict. Pandemics, like earthquakes, wildfires, financial crises, and wars, are not normally distributed; there is no cycle of history to help us anticipate the next catastrophe. In this episode, Michael Shermer speaks with one of the world's most renowned historians, Niall Ferguson, who explains why our ever more bureaucratic and complex systems are making us worse, not better, at handling disasters.