Episode 43 - Energy use and AI with Alex de Vries, Digiconomist
Nov 2, 2023
Alex de Vries, founder of Digiconomist and an expert on the energy use of digital technologies, discusses the rapid growth of AI's power consumption, comparing its trajectory to Bitcoin's. The episode explores estimating AI energy consumption by tracking GPU production, and touches on the challenges of measuring power requirements, the dominance of NVIDIA chips in the AI market, and the potential environmental impact of discarded GPUs. It emphasizes the energy-intensive nature of AI, questions how reliably its growth can be forecast, and suggests that regulators should take note.
Artificial intelligence could consume 0.5% of global electricity within 5 years, similar to the growth rate of Bitcoin mining.
Estimating AI energy consumption is challenging, but analyzing the supply chain, particularly NVIDIA's production, can provide insights.
Deep dives
Energy Use of Cryptocurrency Mining vs. Artificial Intelligence
Cryptocurrency mining has seen a rapid increase in energy consumption, with Bitcoin alone consuming half a percent of global electricity. However, artificial intelligence (AI) is emerging as a new trend with similar potential energy consumption. Recent analysis suggests that by 2027, the energy consumption of devices used for AI purposes could be equivalent to that of Bitcoin mining today. AI's energy consumption is harder to measure than cryptocurrency mining, but by looking at the supply chain, particularly the dominance of NVIDIA, estimates can be made. The growth in AI-related electricity consumption depends on factors such as production capacity, demand, and efficiency gains. However, there are still uncertainties and limitations in accurately predicting AI's energy use. While AI offers potential benefits, such as increasing production efficiency, the net environmental impact is complex and depends on various factors, including rebound effects.
Challenges in Estimating AI Energy Consumption
Estimating AI energy consumption is more challenging than cryptocurrency mining. Unlike mining, there is no central blockchain or daily numbers for computational resources used in AI. However, the supply chain, particularly the production of NVIDIA servers, can provide some insights. While this approach has limitations and uncertainties, it gives an idea of how AI-related electricity consumption may develop. The lifespan of AI chips and their disposal also impact energy use. Another challenge is differentiating between training and inference, as both contribute to energy consumption. The increasing adoption of AI, coupled with the demand for more powerful models, may lead to significant energy usage. However, the exact magnitude and specifics depend on numerous factors.
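The supply-chain approach described above boils down to simple arithmetic: multiply the number of AI servers that can be produced and deployed by their power draw and running time. A minimal sketch in Python, using purely illustrative inputs (the server count, per-server wattage, and utilization below are assumptions for demonstration, not figures from the episode):

```python
# Back-of-envelope estimate of AI server fleet energy use.
# All input figures are illustrative assumptions.

HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def fleet_energy_twh(servers: int, watts_per_server: float, utilization: float) -> float:
    """Annual electricity use (TWh) of a fleet of AI servers.

    servers          -- number of deployed AI servers (assumption)
    watts_per_server -- average power draw per server in watts (assumption)
    utilization      -- fraction of the year running at that draw (assumption)
    """
    watt_hours = servers * watts_per_server * utilization * HOURS_PER_YEAR
    return watt_hours / 1e12  # Wh -> TWh

# Example: a hypothetical fleet of 1.5 million servers drawing
# 6.5 kW each, running around the clock.
estimate = fleet_energy_twh(1_500_000, 6500, 1.0)
print(f"{estimate:.1f} TWh/year")  # prints "85.4 TWh/year"
```

Varying the inputs shows how sensitive the result is to production capacity, deployment rates, and utilization, which is exactly why the episode stresses the uncertainties in any forecast.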
Potential Implications and Trade-offs of AI Energy Consumption
The increasing energy consumption of AI poses potential implications and trade-offs. While improved efficiency can help mitigate energy use in regular data centers, the complex nature of AI and its growing demand may result in different dynamics. The growth of AI could lead to more power-hungry machines, a higher need for cooling, and increased electronic waste. Additionally, the potential rebound effects where efficiency gains lead to increased resource consumption need to be considered. Balancing the benefits and challenges of AI requires addressing sustainability, privacy, and bias concerns. It's crucial to determine if the benefits outweigh the environmental impact and resource consumption.
Comparison of Cryptocurrency Mining and AI
Cryptocurrency mining and AI differ in terms of outputs and usefulness. While mining produces nothing of value, AI contributes to models that can be applied in various domains. However, both have witnessed rapid growth in energy consumption. AI's closer integration with regular data centers, compared to the separate community of miners, signifies a difference in perception. The concerns surrounding energy usage in AI, its potential impact on renewable energy demand, and its proximity to regular IT services necessitate a nuanced understanding. The response from tech giants to AI energy consumption research has been limited, with no significant engagement observed.
Artificial intelligence could grow from almost nothing to using half a percent of the world's electrical power within five years, according to Alex de Vries of Digiconomist.
That's a crazy rate of growth, but it's not unprecedented. Bitcoin followed almost exactly the same trajectory, expanding from nothing to a sector whose energy use is comparable with that of regular data centers. But the similarities end there, says de Vries, who provided reliable tracking data for the growth of Bitcoin and is ready to do the same for AI.
A year ago, he talked us through his methodology for analyzing Bitcoin energy usage. Now he's back, explaining how we can estimate the consumption of AI systems. This time round, it's all about tracking how many GPUs NVIDIA can make, and seeing where they are likely to end up.
The actual figure depends on a lot of things, and could be higher if more GPUs emerge, or if they are deployed differently. There are questions around the depreciation of the hardware, and how and where AI inference is delivered.
Listen in to find out how AI's thirst for power is going to affect the world.