"The Cognitive Revolution" | AI Builders, Researchers, and Live Player Analysis

The Future of the Transformer Part 2 with Trey Kollmer

Oct 20, 2023
Trey Kollmer discusses recent AI research, including techniques that could reduce global compute needs for AI training by 10%. He explains why analogical prompting can outperform few-shot prompting, and covers strategies like ring attention for extending context windows while improving computational efficiency. The conversation also touches on keeping pace with the field's rapid evolution.
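To give a flavor of the prompting comparison, here is a minimal sketch. The prompt wording and the helper function are illustrative assumptions, not taken from the episode: the core idea of analogical prompting is to ask the model to recall and solve its own related problems before answering, instead of relying on hand-written few-shot exemplars.

```python
# A minimal sketch contrasting few-shot and analogical prompting.
# The prompt wording and `build_prompt` are illustrative assumptions,
# not from the episode.

FEW_SHOT_PROMPT = """\
Q: A train travels 60 miles in 1.5 hours. What is its average speed?
A: 60 / 1.5 = 40 mph.

Q: {question}
A:"""

# Analogical prompting: rather than supplying hand-written exemplars,
# ask the model to generate its own relevant examples before answering.
ANALOGICAL_PROMPT = """\
Problem: {question}

First, recall two or three related problems and briefly solve them.
Then, using those as analogies, solve the original problem step by step."""

def build_prompt(question: str, analogical: bool = True) -> str:
    template = ANALOGICAL_PROMPT if analogical else FEW_SHOT_PROMPT
    return template.format(question=question)

if __name__ == "__main__":
    q = "A cyclist covers 45 km in 2.5 hours. What is her average speed?"
    print(build_prompt(q))
```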
INSIGHT

Compute Needs and Superhuman Performance

  • New techniques can reduce global compute needs for AI training by 10%.
  • Longer context windows let models take in entire books at once, potentially enabling learning beyond what a human reader could absorb (a toy ring attention simulation follows this list).
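Ring attention, the context-extension strategy mentioned in the episode summary, splits the sequence across devices and rotates key/value blocks around a ring, so no device ever materializes the full attention matrix. Below is a toy single-process simulation under that reading; the shapes and block sizes are illustrative, and the bookkeeping follows the standard streaming (online) softmax recurrence rather than any code from the episode.

```python
import numpy as np

# Toy single-process simulation of ring attention. Each "device" holds
# one block of queries; key/value blocks travel around the ring one hop
# per step, so every query block eventually sees every KV block without
# any device holding the full attention matrix.

def ring_attention(Q, K, V, num_devices):
    seq, d = Q.shape
    block = seq // num_devices
    Qb = Q.reshape(num_devices, block, d)
    Kb = K.reshape(num_devices, block, d)
    Vb = V.reshape(num_devices, block, d)

    # Per-device running statistics for the streaming softmax.
    num = np.zeros((num_devices, block, d))        # weighted value sums
    den = np.zeros((num_devices, block))           # softmax denominators
    mx = np.full((num_devices, block), -np.inf)    # running row maxima

    for step in range(num_devices):
        for dev in range(num_devices):
            # At this step, device `dev` holds the KV block that has
            # travelled `step` hops around the ring.
            src = (dev + step) % num_devices
            scores = Qb[dev] @ Kb[src].T / np.sqrt(d)
            new_mx = np.maximum(mx[dev], scores.max(axis=-1))
            corr = np.exp(mx[dev] - new_mx)
            w = np.exp(scores - new_mx[:, None])
            num[dev] = num[dev] * corr[:, None] + w @ Vb[src]
            den[dev] = den[dev] * corr + w.sum(axis=-1)
            mx[dev] = new_mx

    return (num / den[..., None]).reshape(seq, d)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    Q, K, V = (rng.standard_normal((16, 8)) for _ in range(3))
    # Reference: ordinary full softmax attention over the whole sequence.
    s = Q @ K.T / np.sqrt(8)
    p = np.exp(s - s.max(-1, keepdims=True))
    ref = (p / p.sum(-1, keepdims=True)) @ V
    assert np.allclose(ring_attention(Q, K, V, num_devices=4), ref)
```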
INSIGHT

Thinking and Pause Tokens

  • Language models typically emit each output token immediately, which caps how much computation goes into any single answer.
  • Pause tokens give the model extra sequence positions, and therefore more hidden vectors to manipulate before its answer is read, similar in spirit to chain-of-thought prompting (a minimal sketch follows this list).
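A minimal sketch of the input-side mechanics, assuming a hypothetical vocabulary with a dedicated <pause> token; the token id and helper names below are invented for illustration. Note that, per the underlying research, pause tokens only help if the model was trained or finetuned with them.

```python
# Sketch of pause-token mechanics. Appending learnable <pause> tokens
# gives the transformer extra positions, and so extra hidden vectors,
# to manipulate before its answer is read off. PAUSE_ID and `model`
# behavior are illustrative assumptions; a model must be trained with
# pause tokens for this to improve accuracy.

PAUSE_ID = 50000  # hypothetical id of a dedicated <pause> token

def with_pauses(prompt_ids: list[int], num_pauses: int) -> list[int]:
    """Append <pause> tokens so the model computes longer before answering."""
    return prompt_ids + [PAUSE_ID] * num_pauses

def read_answer(generated_ids: list[int]) -> list[int]:
    """Outputs emitted at pause positions carry no content; drop them."""
    return [t for t in generated_ids if t != PAUSE_ID]

if __name__ == "__main__":
    prompt = [101, 2054, 2003, 1017, 1009, 1018]  # hypothetical token ids
    padded = with_pauses(prompt, num_pauses=10)
    print(padded)  # the model sees 10 extra positions of hidden computation
```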
INSIGHT

Combining Pause Tokens and Chain of Thought

  • Chain-of-thought prompting exposes the model's reasoning for inspection; pause tokens add only hidden computation, which cannot be examined.
  • Combining pause tokens with chain-of-thought prompting might yield the best of both: accuracy and interpretability (see the speculative sketch below).
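The combination is explicitly speculative in the episode, so this sketch simply composes the two prompt-side mechanics: a visible chain-of-thought instruction plus appended pause tokens. The prompt wording, the <pause> string, and the helper name are assumptions, and again a model would need to be trained with pause tokens for them to matter.

```python
# Speculative sketch: a chain-of-thought instruction keeps the reasoning
# visible and inspectable, while appended <pause> tokens buy the model
# extra hidden computation. Prompt wording and token string are invented
# for illustration.

PAUSE = "<pause>"

def cot_with_pauses(question: str, num_pauses: int = 10) -> str:
    cot_instruction = "Let's think step by step."  # elicits visible reasoning
    return f"{question}\n{cot_instruction}\n" + PAUSE * num_pauses

if __name__ == "__main__":
    print(cot_with_pauses("If 3 pens cost $4.50, what do 7 pens cost?"))
```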