DeepMind's new language AI is small, but mighty. It outperforms gigantic models that have 20 times the parameters thanks to its retrieval database, which holds two trillion words. So it's not that you have less data overall, per se, but because the model itself is smaller, it's easier to train and faster to get outputs from.
Happy new year!
Our last episode of 2021 is also our 82nd episode, with a summary and discussion of last week's big AI news!
Outline:
Subscribe: RSS | iTunes | Spotify | YouTube
Feel free to email us your thoughts or feedback at contact@lastweekinai.com