
Episode 120: Machine Learning

Android Developers Backstage


Is There a One-to-One Mapping of Words?

The new architecture, this transformer network, does so well. It's able to better model these long-term dependencies, so it doesn't forget what happened at the beginning of the sentence by the end, because of the attention mechanism. How you represent words is very important, and words have multiple senses, so you can understand a word in the context of the words around it. And that's why these kinds of attention-based models do so well: they can adapt. They can understand whether, in this context, "bank" means somewhere that you store money versus the bank of a river.
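
The mechanism the hosts describe can be sketched in a few lines. Below is a minimal NumPy illustration of scaled dot-product self-attention, not from the episode; the embeddings and weight matrices are random toy values. The point it demonstrates is the "bank" example: the same input word vector comes out with a different representation depending on the surrounding words.

```python
# Minimal sketch of scaled dot-product self-attention (toy values, not
# from the episode). Each token's output is a context-weighted mix of
# every token's value vector, so identical word vectors diverge once
# their neighbors differ.
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])            # token-to-token affinities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax over the context
    return weights @ V                                 # contextual embeddings

rng = np.random.default_rng(0)
d = 8                                                  # toy embedding size
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))

# Hypothetical 5-token sentences: "bank" sits at position 1 in both.
sent_river = rng.normal(size=(5, d))                   # "...bank of the river"
sent_money = rng.normal(size=(5, d))                   # "...bank holds money"
sent_money[1] = sent_river[1]                          # identical "bank" vector

out_river = self_attention(sent_river, Wq, Wk, Wv)
out_money = self_attention(sent_money, Wq, Wk, Wv)

# The attended "bank" representations differ because the contexts differ.
print(np.allclose(out_river[1], out_money[1]))         # False
```

Because the softmax weights depend on every other token in the sentence, no information about a word far from the end is discarded, which is the long-range-dependency advantage over recurrent models mentioned above.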
