
The Thesis Review
[25] Tomas Mikolov - Statistical Language Models Based on Neural Networks
May 14, 2021
01:19:17
Tomas Mikolov is a Senior Researcher at the Czech Institute of Informatics, Robotics, and Cybernetics. His research has covered topics in natural language understanding and representation learning, including Word2Vec, as well as complexity.
Tomas's PhD thesis is titled "Statistical Language Models Based on Neural Networks", which he completed in 2012 at the Brno University of Technology. We discuss compression and recurrent language models, the backstory behind Word2Vec, and his recent work on complexity & automata.
Episode notes: https://cs.nyu.edu/~welleck/episode25.html
Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html
Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview