A neural network is a complicated piece of linear algebra that uses many matrices and weights to produce a distribution over tokens. It's simpler to think about if we just pretend that a token is a word, right? And this is the form that I think most people can understand intuitively from their experience with autocomplete on their phone: you can imagine that there's some process that just looks over all the text that I've typed so far, and then sees what words are likely to follow it.
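The autocomplete intuition described in the quote above can be sketched as a toy bigram model: count which words follow which, then turn the counts into a probability distribution over the next word. The corpus and function names here are hypothetical illustrations; a real neural network learns this mapping through matrices and weights rather than explicit counts.

```python
from collections import Counter, defaultdict

# Hypothetical toy corpus standing in for "all the text I've typed so far".
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word (a bigram model).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def next_word_distribution(word):
    """Return a probability distribution over words likely to follow `word`."""
    counts = following[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

print(next_word_distribution("the"))
# → {'cat': 0.5, 'mat': 0.25, 'fish': 0.25}
```

A language model does the same job at vastly larger scale, over subword tokens instead of whole words, with the distribution computed by the network rather than looked up in a table.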
This is a special preview episode of The Cognitive Revolution: How AI Changes Everything. Hosted by Erik Torenberg and Nathan Labenz, TCR features in-depth interviews with the creators, builders, and thinkers pushing the bleeding edge of AI. On this episode, they talk with Riley Goodside, the first Staff Prompt Engineer at Scale AI and an expert in prompting LLMs and integrating them into AI applications.
Check out The Cognitive Revolution, the perfect AI interview complement to The AI Breakdown: https://link.chtbl.com/TheCognitiveRevolution
Find TCR on YouTube: https://www.youtube.com/@CognitiveRevolutionPodcast