
#017: Ken Miller – Podcasting, AI, ChatGPT, Leadership, Learning, and Fathom
The Prompt
Is There a Pre-Training Layer?
All of these language models are essentially pre-trained. Their weights are constantly updated and adjusted during that training process. What's really fascinating is that, in a way, the entire internet has passed through this neural network during training to set all of those weights. And yet when you ask it a question, it can surface all of this information that it's learned.
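The excerpt describes pre-training only at a high level. As an illustration, not anything discussed in the episode, here is a minimal sketch in PyTorch of a next-token prediction loop: each pass over a toy corpus nudges the model's weights, which is the "constant adjustment" described above. The model, corpus, and hyperparameters are all hypothetical placeholders.

```python
import torch
import torch.nn as nn

# Toy "corpus": a tiny character-level stand-in for the web-scale text
# that real pre-training consumes.
corpus = "the quick brown fox jumps over the lazy dog " * 20
vocab = sorted(set(corpus))
stoi = {ch: i for i, ch in enumerate(vocab)}
data = torch.tensor([stoi[ch] for ch in corpus], dtype=torch.long)

class TinyLM(nn.Module):
    """A deliberately small next-token predictor (embedding -> GRU -> logits)."""
    def __init__(self, vocab_size, dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.rnn = nn.GRU(dim, dim, batch_first=True)
        self.head = nn.Linear(dim, vocab_size)

    def forward(self, x):
        h, _ = self.rnn(self.embed(x))
        return self.head(h)

model = TinyLM(len(vocab))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()
block = 16  # context length for each training example

# "Pre-training" loop: every batch of text nudges the weights toward
# predicting the next token, which is how the model ends up storing
# what it has read.
for step in range(200):
    i = torch.randint(0, len(data) - block - 1, (1,)).item()
    x = data[i:i + block].unsqueeze(0)          # input tokens
    y = data[i + 1:i + block + 1].unsqueeze(0)  # next-token targets
    logits = model(x)
    loss = loss_fn(logits.view(-1, len(vocab)), y.view(-1))
    optimizer.zero_grad()
    loss.backward()   # gradients say how to adjust every weight
    optimizer.step()  # the "constant adjustment" during training
```

At web scale the same loop runs over trillions of tokens with a far larger model, which is why the finished weights end up encoding so much of what passed through them.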