
Syntax and the Future of Programming with Josh Warner
Software Unscripted
00:00
The Problems With Current Models
The current models have a fixed amount of computation in any step. When ChatGPT is generating Python to do XYZ, the only things it knows about are things it was trained on originally. The other problem is training data: the data on the internet sucks, and it's full of lies and sarcasm that will lead you astray. It seems like it ought to be fairly simple to start cleaning that up.