The Importance of AGI in Life
We don't have to have superintelligent AI; it's not a requirement for a happy existence. We can do all the things we want, including life extension, with much less intelligent systems. Maybe building them is a very bad idea and we should not do that. So is it because such a superintelligence will be running over a long period of time, increasing the cumulative risk of failure over, say, decades or centuries, that we can't accept even a tiny probability of failure for these systems? I would suspect it would be a very quick process. Expecting something to be 100% safe is just unrealistic in any field.