
Émile Torres on Transhumanism, Longtermism and Existential Risks
Singularity.FM
The Importance of Existential Risk
I think it's highly premature to call GPT-4 anything close to being an AGI. I'm fully aware that I may be totally and completely wrong here, but it's not obvious at the moment that GPT-4 really is close to AGI, or that it's some stepping stone of self-improvement where we're all going to die because it's going to become AGI. Yeah.