Peter Freed, a founding partner at Near Horizon Group and former director of energy strategy at Meta, shares his expertise on data centers and clean energy. He discusses how the AI boom is straining the grid and explores the real-world challenges of developing new data centers in 2025. Freed weighs the risk that tightening natural gas supplies could prompt an accidental return to coal, and he underscores the enormous electricity demands of tech companies. He also delves into the complexities of powering and cooling the data center boom without compromising on sustainability.
Data center development now prioritizes power availability, leading developers to seek locations with reliable access to clean energy.
Hyperscale data centers have trended toward ever-larger campuses, but logistical, water, and permitting challenges often pull developers back to a standard 200-megawatt configuration.
The rise of AI-driven applications increases heat generation, prompting a focus on energy efficiency and innovative cooling methods to manage consumption.
Deep dives
Power-First Approach in Data Center Development
Data center development has shifted significantly toward prioritizing power availability over traditional real estate strategy. Developers now seek locations with reliable access to clean energy, often collaborating with local utilities or pursuing independent power sources to ensure sufficient electricity supply. This change stems from a growing recognition of power constraints, particularly in markets with high data center demand. As a result, data center infrastructure is being built around accessible energy sources, underscoring the importance of energy strategy in the planning phase.
Challenges of Hyperscale Data Centers
Hyperscale data centers, typically around 200 megawatts in size, have emerged as the standard capacity for large developments. These facilities often require substantial land and utility infrastructure to support their scale, with energy consumption comparable to that of entire small towns. The industry has seen a trend towards larger campuses, though the practical challenges associated with logistics, water availability, and regulatory approvals often lead developers to revert to the more manageable 200-megawatt configuration. This balance aims to streamline the development process while meeting the rising demand for computing power.
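To put that scale in rough numbers, here is a back-of-the-envelope sketch; the load factor and per-household consumption figures are illustrative assumptions, not numbers from the episode.

```python
# Back-of-the-envelope annual energy use for a 200-megawatt data center campus.
capacity_mw = 200
hours_per_year = 8760
load_factor = 0.9  # assumed near-constant utilization, typical for data centers

annual_mwh = capacity_mw * hours_per_year * load_factor
annual_twh = annual_mwh / 1_000_000

# For context, U.S. households average roughly 10-11 MWh of electricity per year (EIA).
households_equivalent = annual_mwh / 10.5

print(f"{annual_twh:.2f} TWh per year, roughly {households_equivalent:,.0f} average U.S. homes")
```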
The Intersection of Energy Efficiency and AI Demand
The increasing heat generated by advanced AI workloads raises significant questions about energy consumption and cooling efficiency in data centers. Power Usage Effectiveness (PUE), the ratio of a facility's total power draw to the power delivered to its computing equipment, is the key metric: the closer it is to 1.0, the less energy is spent on cooling and other overhead. Modern designs push toward lower PUE figures through innovative cooling methods like evaporative cooling. As AI technology evolves, cooling solutions are adapting in turn, driving further efficiency improvements. This dynamic highlights a crucial consideration for data center operators: the interconnectedness of energy efficiency, water usage, and AI scalability.
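As a concrete illustration of the metric, the sketch below computes PUE from assumed facility and IT loads; the numbers are illustrative, not figures from the episode.

```python
def power_usage_effectiveness(total_facility_kw: float, it_load_kw: float) -> float:
    """PUE = total facility power / IT equipment power.

    A value of 1.0 would mean every watt goes to computing; modern
    hyperscale facilities commonly report figures around 1.1, while
    older enterprise data centers can run well above 1.5.
    """
    return total_facility_kw / it_load_kw

# Illustrative example: 100 MW of IT load plus 12 MW of cooling,
# power distribution, and other overhead.
print(power_usage_effectiveness(112_000, 100_000))  # 1.12
```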
Market Speculation and Data Center Demand
A notable trend in the data center market is speculative behavior: utilities are being flooded with unsolicited load requests as firms race to capitalize on the growth of AI and rising energy needs. This speculative influx complicates planning and resource allocation, because utilities struggle to determine which projects will actually materialize. The phenomenon of "vapor watts," requested capacity that may never be built, illustrates how theoretical demand can obscure genuine project viability and raise concerns about grid planning and stability.
Commitments to Clean Energy Amid Industrial Needs
Data center developers are navigating the dual challenges of rapid growth and a commitment to clean energy. Many firms maintain their renewable energy commitments even as they explore options like building new gas-fired plants to meet their growing energy needs. Recent projects pair new gas capacity with solar development, aiming to meet demand while adhering to sustainability goals. This scenario underscores the difficulty of transitioning to a low-carbon energy model while addressing immediate industrial electricity requirements.
If you care about decarbonizing the power grid anytime soon, you have to care about data centers. The AI boom and the ongoing growth of the internet have driven a big new cycle of data center construction in the United States, with tech companies trying to buy amounts of electricity comparable to those used by large cities.
Peter Freed has seen this up close. As Meta’s former director of energy strategy, he worked on clean energy procurement and data center development from 2014 to 2024. He is now a founding partner at the Near Horizon Group, where he advises investors and companies on emerging topics in data centers and advanced clean energy.
On this week’s episode of Shift Key, Rob and Jesse talk with Peter about whether AI and new data centers are going to blow up the grid and break decarbonization. What are the real-world constraints on developing a data center in 2025? Are tech companies beginning to run out of natural gas to burn? What do their investments in clean energy mean? And could the rise of AI prompt an accidental return to coal? Shift Key is hosted by Jesse Jenkins, a professor of energy systems engineering at Princeton University, and Robinson Meyer, Heatmap’s executive editor.