I think the key is that the honey has lost its sweetness. That's not something that's rational. Once he's there, two things are inevitable: the mice are going to finish eating the twig, and you're going to fall to the dragon. But what's not inevitable, or logically entailed by anything, is whether you find the honey sweet or not. And I think the fact that he didn't find the honey sweet, the fact that these pleasures are no longer pleasures for him, that is an emotional reaction to the facts. You should get to this life-or-death point where the fact that you're eventually going to die shouldn't make you just question whether you want…
David and Tamler find themselves unable to attach rational meaning to a single act in their entire lives. Let’s say we publish more articles and books. What then? What about our kids? They’re going off to college. Why? What for? We think about the future of the podcast. Let’s say we get bought out by Spotify and become more famous than Joe Rogan, Dolly Parton, and even Yoel Inbar -- more famous than all the podcasters in the world. So what?
And we can find absolutely no reply.
Plus, we take a test to determine whether we can tell an AI apart from an analytic philosopher. When should we start getting scared of what AIs are gonna do to us, or what we're doing to them?
*Note: the main segment is on the first half of Tolstoy's great memoir "A Confession," but you don't need to be familiar with the text to appreciate this discussion.