76 - Increasing In-Class Similarity by Retrofitting Embeddings with Demographics, with Dirk Hovy

NLP Highlights

The Importance of the Translation Matrix in Inductive Learning

We find that the classifier actually performs better if it's trained on the original data transformed by the translation matrix, because it's an approximation. The least-squares approximation itself acts a little bit like a regularizer in this case. We could have used a more sophisticated nonlinear model on top of this. Another thing we're exploring right now is to learn a nonlinear transformation matrix, or essentially a network, that does that. But this is something that we're currently looking into.
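The idea above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a source embedding matrix `X`, a target matrix `Y`, and learns a linear translation matrix `W` by least squares, then transforms the original data with it (the approximation error is what acts like a mild regularizer).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 100 source embeddings (dim 8) and noisy targets
# produced by an unknown linear map plus noise.
X = rng.normal(size=(100, 8))
W_true = rng.normal(size=(8, 8))
Y = X @ W_true + 0.1 * rng.normal(size=(100, 8))

# Least-squares translation matrix: minimizes ||X @ W - Y||_F^2.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Train the downstream classifier on the transformed originals,
# rather than on Y directly, as described in the quote.
X_transformed = X @ W
print(X_transformed.shape)
```

Because `W` only approximates the true map, `X_transformed` smooths over noise in the targets, which is the regularizing effect mentioned.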

