Speaking of intelligence

Google DeepMind: The Podcast

The Impact of Large Language Models on the Environment

GPT-3 reads huge chunks of the internet as its training set. It's built from a network with nearly 100 layers of neurons and 175 billion parameters. Researchers need to make these models so gigantic because they are far less efficient at acquiring language than human brains. DeepMind is looking at ways to mitigate the environmental cost, for example by creating more efficient ways of training these models.

