This is also a big step relative to the original LLaMA. It's largely the same architecture, of course, but it is now trained on two trillion tokens. And it has this Llama 2-Chat variant that's been fine-tuned on data with over 1 million new human annotations. So they really curated a lot of data specifically for this chat use case.
Our 130th episode with a summary and discussion of last week's big AI news!
Co-hosted this week by Jon Krohn of the Super Data Science podcast.
Correction: Elon Musk's company is named xAI, not x.AI.
Read our text newsletter and comment on the podcast at https://lastweekin.ai/
Email us your questions and feedback at contact@lastweekin.ai
Timestamps + links:
- (00:00) Intro / Banter
- (07:30) Response to listener comments / corrections
- Tools & Apps
- Applications & Business
- Projects & Open Source
- Research & Advancements
- Policy & Safety
- Synthetic Media & Art
- (01:44:20) Outro