The Curious Case of Neural Text Degeneration (2019)
We found that if you search for the argmax, the highest-probability sequence out of your neural language model, you get degenerate text. We've tested a lot of application scenarios, including machine translation, and we could demonstrate that across the board you can improve performance right away. In some cases, even using NeuroLogic decoding on top of unsupervised, off-the-shelf GPT-2 can do better than a supervised model based on beam search. This is a really unexpected empirical result about how well these algorithms work.
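The contrast the speaker draws can be sketched in a few lines: argmax (greedy) decoding always takes the single most likely next token, which is what drives degenerate, repetitive text, whereas a sampling scheme like nucleus (top-p) sampling draws from the head of the distribution. This is a minimal toy illustration, not the paper's actual implementation; the vocabulary and probabilities here are made up for the example.

```python
import random

def greedy_pick(probs):
    # Argmax decoding: deterministically take the most likely token.
    return max(probs, key=probs.get)

def nucleus_pick(probs, p=0.9, rng=random):
    # Nucleus (top-p) sampling: keep the smallest set of top tokens whose
    # cumulative probability reaches p, then sample within that set.
    items = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)
    nucleus, total = [], 0.0
    for tok, pr in items:
        nucleus.append((tok, pr))
        total += pr
        if total >= p:
            break
    toks, weights = zip(*nucleus)
    return rng.choices(toks, weights=weights, k=1)[0]

# Hypothetical next-token distribution for one decoding step.
probs = {"the": 0.5, "a": 0.3, "cat": 0.15, "dog": 0.05}
print(greedy_pick(probs))          # always "the"
print(nucleus_pick(probs, p=0.9))  # one of "the", "a", "cat"; "dog" is cut off
```

Run step after step, the greedy rule keeps returning the same high-probability continuations, which is the loop-and-repeat failure mode the episode describes; the sampler's controlled randomness avoids it.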