The Cost of Scaling a Language Model
Simon: There was a fun tweet yesterday, like: GPT-4, they haven't said how big it is. We know that GPT-3 was 175 billion parameters; they won't reveal how big 4 is. Somebody got a stopwatch and said, okay, well, I'll ask the same question of 3 and 4 and time it. And 4 took ten times longer to produce a result, so I reckon 4 is ten times 175 billion parameters.

Simon: If I could have the smallest possible model that can do this pattern where it can call extra tools, it can make API calls and so forth, the stuff I could build with that is kind of incredible, and it would run on my phone. That's the thing I'm most excited about.
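The "call extra tools" pattern Simon mentions can be sketched as a small loop: the model emits either a tool call or a final answer, and the harness dispatches tool calls and feeds results back. Everything here is illustrative, not from the conversation: `fake_model` is a hypothetical stand-in for a real (ideally small, on-device) language model, and `calculator` is a toy tool.

```python
def calculator(expression: str) -> str:
    """A toy tool: evaluate a simple arithmetic expression."""
    return str(eval(expression, {"__builtins__": {}}, {}))

TOOLS = {"calculator": calculator}

def fake_model(messages):
    """Hypothetical stand-in for a model: first emits a tool call,
    then turns the tool's output into a final answer."""
    tool_results = [m for m in messages if m["role"] == "tool"]
    if not tool_results:
        return {"tool": "calculator", "input": "175 * 10"}
    return {"answer": f"Roughly {tool_results[-1]['content']} billion parameters."}

def run(question: str) -> str:
    """Agent loop: keep calling the model, dispatching any tool it
    requests, until it produces a final answer."""
    messages = [{"role": "user", "content": question}]
    while True:
        reply = fake_model(messages)
        if "answer" in reply:
            return reply["answer"]
        # Dispatch the requested tool and append its result for the model.
        output = TOOLS[reply["tool"]](reply["input"])
        messages.append({"role": "tool", "content": output})

print(run("If GPT-3 is 175B parameters and GPT-4 took 10x longer, how big?"))
```

The same loop shape applies regardless of model size, which is why a small on-device model that reliably emits tool calls would be enough for this kind of application.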