
AI and Existential Risk (Robert Wright & Connor Leahy)
Robert Wright's Nonzero
00:00
The Importance of Scaling in AI
The real innovation is what's called the transformer, which is the architecture that GPT uses. When I was working on, like, simple prediction AIs or reasoning AIs and so on, there was always this problem that you could maybe solve a specific sub-problem with some new architecture, but then it wouldn't generalize. Now we have an architecture you can apply to language, even vision, sound, whatever you pick, and you can just make it bigger. You can add more computing power, more parameters, and it gets better. It learns new, higher-level patterns. It becomes more robust. It's like, wait, it scales. This is unreasonably efficient. Mm-hmm.
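The scaling point hinges on the transformer's core operation, scaled dot-product attention, which works on any input that can be turned into vectors (text tokens, image patches, audio frames), which is why the same architecture generalizes across modalities. A minimal sketch in NumPy, with illustrative names and dimensions not taken from the episode:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Each position (row) attends to every other position:
    # scores measure query/key similarity, scaled by sqrt(d_k)
    # to keep gradients stable as dimensions grow.
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax over positions turns scores into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Output is a weighted mix of the value vectors.
    return weights @ V, weights

# Toy example: 4 "tokens" with 8-dimensional embeddings.
# Real models stack many such layers and scale d_k and the
# token count up, which is where the extra parameters go.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
out, w = scaled_dot_product_attention(X, X, X)
```

Nothing in the operation cares whether the rows came from words or image patches, which is the "pick whatever" property Leahy describes: to scale, you widen the vectors, lengthen the sequence, and stack more layers.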