
#017: Ken Miller – Podcasting, AI, ChatGPT, Leadership, Learning, and Fathom
The Prompt
Is There a Pre-Training Layer?
All of these language models are essentially pre-trained: their weights are constantly updated and adjusted during that training process. What's really fascinating is that, in a way, the entire internet has passed through this neural network during training to set all of those weights. And yet when you ask it a question, it can draw on all of the information it has learned.
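To make the idea concrete, here is a minimal sketch of what "passing text through the network to set the weights" looks like as a next-token-prediction training loop. Everything in it, including the model shape, the random toy corpus, and the hyperparameters, is an illustrative assumption rather than anything described in the episode or used by ChatGPT.

```python
# Toy next-token pre-training loop (illustrative sketch only).
import torch
import torch.nn as nn

vocab_size, embed_dim, context_len = 100, 32, 8

# Stand-in "corpus": random token ids playing the role of internet text.
corpus = torch.randint(0, vocab_size, (1000,))

# Deliberately tiny model: embeddings plus one linear layer that predicts
# the next token. Real LLMs use deep transformer stacks instead.
model = nn.Sequential(
    nn.Embedding(vocab_size, embed_dim),
    nn.Flatten(),
    nn.Linear(embed_dim * context_len, vocab_size),
)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(100):
    # Sample (context, next-token) pairs from the corpus.
    starts = torch.randint(0, len(corpus) - context_len - 1, (16,)).tolist()
    x = torch.stack([corpus[s : s + context_len] for s in starts])
    y = torch.stack([corpus[s + context_len] for s in starts])

    logits = model(x)          # distribution over the vocabulary
    loss = loss_fn(logits, y)  # how wrong was the next-token guess?

    optimizer.zero_grad()
    loss.backward()            # gradients nudge every weight slightly
    optimizer.step()           # this is how the text "sets the weights"
```

Each pass over a batch of text adjusts every weight a little; repeated over an enormous corpus, that is the pre-training the quote describes, after which the frozen weights are what answer your questions.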