2min chapter


🤗 All things transformers with Hugging Face

Practical AI: Machine Learning, Data Science, LLM

CHAPTER

Attention Is All You Need

The original transformer paper has the title "Attention Is All You Need." Attention is kind of the key aspect of what makes a transformer. What attention does is compute a distribution, a probability for each of those candidate items, and feed those probabilities back into the model itself. So imagine I have a sentence like "the man walked the dog" and I want to predict the next word in that sentence. Those previous five words would be the five items I'd want to choose from. Attention says: how much weight should I give to each of those previous five words when trying to decide on the next word?
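The weighting the speaker describes can be sketched as toy scaled dot-product attention: each previous word gets a score against a query, and a softmax turns the scores into a probability distribution. The vectors below are made up purely for illustration, not taken from any real model:

```python
import math

def attention_weights(query, keys):
    """Toy scaled dot-product attention: score each previous word's
    key vector against the query, then softmax the scores into a
    probability distribution (the per-word weights)."""
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    m = max(scores)                       # subtract max for stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Five previous words ("the", "man", "walked", "the", "dog"),
# each given a made-up 2-dimensional key vector.
keys = [[0.1, 0.2], [0.9, 0.1], [0.4, 0.8], [0.1, 0.2], [1.0, 0.9]]
query = [1.0, 1.0]  # stand-in for the position being predicted

weights = attention_weights(query, keys)
print(weights)  # five probabilities that sum to 1; "dog" scores highest here
```

In a real transformer the queries, keys, and values are learned projections of the token embeddings, and the weights multiply value vectors rather than being read off directly, but the softmax-over-scores mechanism is the same.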
