
Catherine Olsson and Nelson Elhage: Anthropic, Understanding Transformers
The Gradient: Perspectives on AI
What Are In-Context Learning and Induction Heads?
Some models, especially large language models starting with GPT-2 and GPT-3, are very good at incorporating information that you give them in the context, or in the prompt, to vary their text generation. They learn from the prefix in a very sophisticated and nuanced way. If you give them a prompt that is a serious Q&A with a scientist, they start generating text in the serious-scientist voice. They react in a very nuanced, very responsive way to the earlier context of the prompt.
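One concrete mechanism behind this kind of prefix-sensitivity is the "induction head" pattern mentioned in the title: when the current token has appeared earlier in the context, the head attends to the token that followed it last time and boosts that token as the next prediction. The following is a toy Python sketch of that copy-from-the-prefix behavior, not the authors' code or an actual attention implementation:

```python
# Toy sketch (not a real transformer): the prefix-matching-and-copy
# behavior that induction heads implement. Given [A][B] ... [A],
# predict [B] by looking up what followed [A] earlier in the context.
def induction_predict(tokens):
    current = tokens[-1]
    # Scan the prefix backwards for the most recent earlier occurrence
    # of the current token.
    for i in range(len(tokens) - 2, -1, -1):
        if tokens[i] == current:
            return tokens[i + 1]  # copy the token that followed it
    return None  # no earlier match: this mechanism makes no prediction

# The sequence ends with "the", which appeared earlier followed by "cat",
# so the sketch predicts "cat".
print(induction_predict(["the", "cat", "sat", "on", "the"]))
```

A real induction head does this softly with attention weights over the whole context rather than an exact string match, but the input-output behavior on repeated subsequences is the same.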