
Episode 13: Objections to Artificial General Intelligence
The Theory of Anything
00:00
The Importance of Universal Explainers
David Wheeler: We can conceive of the idea of a partial universal explainer, a universal explainer that only explains some things, but we have no good reason to believe that; it would just be a bad explanation at this point. Therefore the better explanation is that we can explain everything. He says that even though there may be things that are in principle unknowable, that doesn't necessarily mean that the set of all knowable things is finite. So there's still infinite room for progress to be made and for knowledge to be created.