
Audio long-read: Rise of the robo-writers
Nature Podcast
GPT-3 Is a Generative Pre-Trained Transformer
GPT-3 stands for Generative Pre-trained Transformer 3. It's the third in a series, and is more than 100 times larger than its 2019 predecessor. The next-largest model of its kind has 17 billion parameters. In January, Google released a model with 1.6 trillion parameters, though it is equivalent to a dense model with between 10 billion and 100 billion parameters.