Singularity.FM

Émile Torres on Transhumanism, Longtermism and Existential Risks


The Importance of Existential Risk

I think it's highly premature to call GPT-4 anything close to being an AGI. I'm fully aware that I may be totally and completely wrong here, but it's not obvious at the moment that GPT-4 really is close to AGI, or that it's a stepping stone toward self-improvement where we're all going to die because that becomes AGI.
