2min chapter

"The Cognitive Revolution" | AI Builders, Researchers, and Live Player Analysis cover image

E39: Seeing is Believing with MIT’s Ziming Liu

"The Cognitive Revolution" | AI Builders, Researchers, and Live Player Analysis

CHAPTER

The Attention Layer and the Fully Connected Portion of a Transformer

You mentioned the transformer portion. It wasn't entirely clear to me: are you modifying the attention mechanism as well as the fully connected portion of a transformer? Yeah, that's a very good question. Think about attention layers. If we put aside the softmax part, the attention layer is just three matrix multiplications: the key, the query, and the value. We apply the same trick we used in the MLP, treating each of them as a linear layer.
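
Since the chapter describes treating the key, query, and value projections as ordinary linear layers, here is a minimal PyTorch sketch of where those three matrix multiplications sit in an attention block. The names `SimplifiedAttention`, `apply_linear_trick`, and `modify_projections` are hypothetical illustrations; the specific per-layer modification discussed on the episode isn't spelled out in this excerpt, so `apply_linear_trick` is a labeled stand-in.

```python
import torch
import torch.nn as nn


def apply_linear_trick(linear: nn.Linear) -> nn.Linear:
    """Placeholder for whatever per-layer trick is applied to the MLP's
    linear layers; the excerpt doesn't specify it, so this is a stand-in."""
    return linear  # identity here; substitute the real modification


class SimplifiedAttention(nn.Module):
    """Minimal single-head attention, showing the three linear layers."""

    def __init__(self, d_model: int):
        super().__init__()
        # The three matrix multiplications: query, key, and value.
        self.w_q = nn.Linear(d_model, d_model, bias=False)
        self.w_k = nn.Linear(d_model, d_model, bias=False)
        self.w_v = nn.Linear(d_model, d_model, bias=False)

    def modify_projections(self) -> None:
        # Setting the softmax aside, each projection is just a linear
        # layer, so the same trick used on the MLP applies unchanged.
        self.w_q = apply_linear_trick(self.w_q)
        self.w_k = apply_linear_trick(self.w_k)
        self.w_v = apply_linear_trick(self.w_v)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        q, k, v = self.w_q(x), self.w_k(x), self.w_v(x)
        scores = q @ k.transpose(-2, -1) / (x.size(-1) ** 0.5)
        # The softmax itself is left untouched.
        return torch.softmax(scores, dim=-1) @ v
```

The point of the sketch is structural: because the query, key, and value projections are plain linear maps, any technique defined on a linear layer transfers to them directly, with the softmax left out of scope.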
