
Ep 255: Does this research explain how LLMs work?

ToKCast

Transformers Literally Compute Probabilities

Brett notes that vectors and attention weights function as probabilities, which makes Bayes' theorem a natural description of what transformers compute.
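A minimal sketch of the point being made: attention weights come out of a softmax, so they are non-negative and sum to one, i.e. they form a probability distribution over the keys. The scores below are made-up illustrative values, not taken from any real model.

```python
import math

def softmax(xs):
    """Convert raw scores into a probability distribution."""
    m = max(xs)                             # subtract max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

# Toy attention scores: one query dotted with three keys (hypothetical values).
scores = [2.0, 0.5, -1.0]
weights = softmax(scores)

# Each weight is non-negative and the weights sum to 1 -- a probability
# distribution, which is why a Bayesian reading of attention is natural.
print(weights, sum(weights))
```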

Highlight begins at 42:58.
