
[22] Graham Neubig - Unsupervised Learning of Lexical Information
The Thesis Review
The Probabilistic Alignment Grammar
The model that I devised here was based on a PhD thesis by Carl de Marcken at MIT in 1996, which was far before I started researching. The idea is that you do binary merges of characters, based largely on frequency, until you find words, and if you're familiar with byte pair encoding (BPE), it's basically the same idea. My contribution on top of that was largely to apply it to the multilingual setting: instead of doing binary merges of characters in one language, you jointly do binary merges of characters in two languages. It also incorporated the kind of non-parametric Bayesian statistics that I talked about before. There's an O(n^6) sampling algorithm involved in learning this.
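The frequency-based binary-merge idea described above can be sketched as a standard BPE-style loop. This is an illustrative sketch of monolingual greedy pair merging, not the thesis's actual multilingual or Bayesian model; the corpus and merge count are made up for the example.

```python
from collections import Counter

def most_frequent_pair(words):
    """Count adjacent symbol pairs across the corpus, weighted by word frequency."""
    pairs = Counter()
    for word, freq in words.items():
        symbols = word.split()
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return pairs.most_common(1)[0][0] if pairs else None

def merge_pair(pair, words):
    """Replace every adjacent occurrence of the pair with its concatenation."""
    a, b = pair
    merged = {}
    for word, freq in words.items():
        symbols = word.split()
        out, i = [], 0
        while i < len(symbols):
            if i < len(symbols) - 1 and symbols[i] == a and symbols[i + 1] == b:
                out.append(a + b)
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        merged[" ".join(out)] = freq
    return merged

# Toy corpus: words as space-separated characters, with frequencies.
words = {"l o w": 5, "l o w e r": 2, "n e w e s t": 6, "w i d e s t": 3}
for _ in range(4):  # perform a few greedy binary merges
    pair = most_frequent_pair(words)
    if pair is None:
        break
    words = merge_pair(pair, words)
```

After a handful of merges, frequent character sequences like whole short words ("low") emerge as single symbols, which is the sense in which repeated binary merges "find words."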