

DeepSeek Update, Market Crash, Timeline in Turmoil, Is VC Cooked, Zero Cope Policy
Jan 28, 2025
The podcast dives into the recent tech market crash, focusing on Nvidia's stock woes and the implications of the Jevons paradox. It explores Nvidia's pivotal role in AI advancements and the rise of competitors like AMD. A fascinating story reveals a professor misled into posing as an English teacher in a deceptive educational scheme. The discussion also covers the competitive landscape of AI, data privacy concerns, and the evolving venture capital scene amid emerging technologies, closing with a lighthearted yet serious reflection on the challenges and opportunities in AI.
AI Snips
Shifting Scaling Laws
- AI progress has shifted from pre-training scaling laws (more data and compute at training time) to inference-time compute scaling.
- Inference-time compute scaling focuses on spending more compute when querying the model, not just when training it; see the sketch after this list.
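A minimal sketch of what "spending more compute at inference time" can mean in practice: sample several reasoning chains and take a majority vote (self-consistency). The `sample_answer` function below is a hypothetical stand-in for a real LLM API call, not anything from the episode; the point is only that accuracy is bought with more samples rather than more training.

```python
# Inference-time compute scaling via self-consistency (majority voting).
# `sample_answer` is a hypothetical stand-in for an LLM call.
import random
from collections import Counter

def sample_answer(question: str) -> str:
    """Hypothetical model call: returns a noisy answer to the question."""
    # Stand-in: a "model" that answers correctly ("42") about 60% of the time.
    return "42" if random.random() < 0.6 else random.choice(["41", "43"])

def self_consistency(question: str, n_samples: int) -> str:
    """Spend more inference compute (more samples) to get a more reliable answer."""
    votes = Counter(sample_answer(question) for _ in range(n_samples))
    return votes.most_common(1)[0][0]

if __name__ == "__main__":
    random.seed(0)
    for n in (1, 5, 25):  # more samples = more inference-time compute
        print(f"{n:>2} samples ->", self_consistency("What is 6 * 7?", n))
```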
Chain of Thought
- Chain-of-thought (CoT) models improve reasoning by generating intermediate reasoning tokens, which increases inference compute.
- While CoT models improve accuracy, this internal monologue makes them more expensive to run; a sketch follows below.
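A minimal sketch of why chain-of-thought raises cost: the model emits its reasoning before the answer, and every extra token is billed at inference time. The `generate` function is a hypothetical stand-in for an LLM call, and the whitespace-based token count is only a rough proxy for a real tokenizer.

```python
# Why chain-of-thought is more expensive: same answer, more output tokens.
# `generate` is a hypothetical stand-in for an LLM call.
def generate(prompt: str) -> str:
    """Hypothetical model call returning a completion for the prompt."""
    if "step by step" in prompt:
        return ("There are 3 boxes with 4 apples each. "
                "3 * 4 = 12. So the answer is 12.")
    return "12"

def token_count(text: str) -> int:
    """Rough token estimate (whitespace split) used only for cost comparison."""
    return len(text.split())

direct = generate("Q: 3 boxes hold 4 apples each. How many apples? A:")
cot = generate("Q: 3 boxes hold 4 apples each. How many apples? "
               "Let's think step by step. A:")

print("direct answer tokens:", token_count(direct))      # cheap
print("chain-of-thought tokens:", token_count(cot))      # same answer, more compute
```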
Early GPT-3 Hallucinations
- Speaker 0 recalls early GPT-3 experiments where the model would hallucinate and give contradictory advice.
- In one example, it was asked to generate tips for quitting a video game but instead promoted more engagement with the game.