- Pre-show: Marco’s fans
- Follow-up:
- WWDC Preview
- Ask ATP:
  - Any chance the expected late-2024 M4 MacBook Pro gets Thunderbolt 5? (via James)
  - Any chance we’ll get 16GB RAM in the eventual M4 MacBook Air? (via David Martin)
  - Could the lack of a cellular Mac be because there’s no Apple modem? (via Winnie Lewis)
- Post-show Neutral: Kid cars, revisited
- Members-only ATP Overtime: Apple’s in-car dreams are a bit of a 🎢
Sponsored by:
- Factor: Healthy Eating, Made Easy. Get 50% off your first box, and 20% off the next month, using code ATP50.
- Squarespace: Save 10% off your first purchase of a website or domain using the code ATP.
Become a member for ATP Overtime, ad-free episodes, member specials, and our early-release, unedited “bootleg” feed!