Speaking of intelligence

Google DeepMind: The Podcast

CHAPTER

The Impact of Large Language Models on the Environment

GPT-3 reads huge chunks of the internet as its training set. It's built from a network with nearly 100 layers of neurons and 175 billion parameters. Researchers need to make these models so gigantic because they are far less efficient at acquiring language than human brains are. DeepMind is looking at ways to mitigate the environmental cost, for example by creating more efficient ways of training these models.
