#039 - Lena Voita - NLP

Machine Learning Street Talk (MLST)

00:00

Decoding Machine Learning: Visualizations and Hallucinations

This chapter explores the inner workings of language models, focusing on neural machine translation and the phenomenon of model hallucinations. The discussion uses visual metaphors and layer-wise relevance propagation to illustrate how models attribute predictions to their inputs, and highlights how training objectives shape token representations. It also emphasizes that regularity in training data improves model efficiency and reduces hallucinations.
