
DeepSeek FAQ

Stratechery


Intro

This chapter examines DeepSeek's recent breakthroughs with its AI models, R1 and V3, emphasizing innovations such as mixture of experts (MoE) and multi-head latent attention (MLA). It also addresses the political ramifications of these advances, drawing parallels to historical episodes such as those involving Huawei.

Transcript
