The growing demand for electricity driven by artificial intelligence (AI) has become a major topic in energy markets. The International Energy Agency (IEA) projects that by 2030, the power required to operate new and existing AI data centers could exceed 945 terawatt-hours (TWh), which is more than Japan's annual electricity consumption.
However, historical comparisons suggest caution when interpreting such forecasts. In 1999, the US coal industry predicted that information technology would consume half of the nation's electricity by 2020, while Intel estimated that connecting one billion PCs to the web would require as much power as the entire US grid at that time. These predictions did not materialize. Instead, technological advances and shifts in consumption patterns led to different outcomes. For example, between 2010 and 2018, global data center computing increased over 550 times, but energy use in data centers rose only 6%. Currently, internet-related activities account for less than 2% of total Western power usage.
Several factors may limit AI’s long-term impact on energy consumption:
First, advances in AI model design are making systems more efficient. Recent developments from China show that some AI models can achieve about 90% of the performance of larger proprietary models at roughly one-tenth the cost and with significantly less energy use. This efficiency stems partly from open-source approaches and on-device AI, where processing occurs locally on user devices rather than in centralized cloud platforms. Running models locally reduces repeated data transmission and the energy overhead that comes with it.
Second, while training new AI models requires substantial energy over several months, operating these models typically demands far less power once training is complete. As hardware improves and cooling technologies advance, overall efficiency increases. For instance, from 2007 to 2024, the ratio of cooling power needed per unit of server energy dropped from 1.5x to as low as 0.1x in some modern data centers.
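To make the cooling figures concrete, a rough sketch of the arithmetic: if cooling overhead is expressed as a multiple of server power (a simplified PUE-style measure), the drop from 1.5x to 0.1x cited above implies a large cut in total facility power per unit of compute. The 1,000 kW server load below is an illustrative assumption, not a figure from the source.

```python
# Illustrative arithmetic only, using the cooling-overhead ratios cited in
# the text (1.5x in 2007 vs. as low as 0.1x in some modern data centers).

def total_facility_power(server_power_kw: float, cooling_ratio: float) -> float:
    """Total power = server load plus cooling overhead, where the overhead
    is expressed as a multiple of server power (a simplified PUE-style view)."""
    return server_power_kw * (1 + cooling_ratio)

old = total_facility_power(1000, 1.5)  # 2007-era: 2,500 kW to run a 1,000 kW server load
new = total_facility_power(1000, 0.1)  # modern best case: 1,100 kW
reduction = 1 - new / old              # -> 0.56, i.e. ~56% less total facility power
```

In other words, for the same compute, a facility at the low end of today's cooling ratios would draw roughly half the total power of a 2007-era one, before counting any gains in the chips themselves.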
Third, there is potential for AI itself to reduce broader energy usage across sectors outside data centers. Bill Gates has suggested that AI could lower global energy demand through efficiencies it enables elsewhere. For example, an AI-driven building management system in Kansas reduced a facility’s energy use by 16%, achieving a two-year payback period on investment. Since buildings represent nearly 40% of US energy usage according to government statistics, scaling such solutions could have significant effects.
Additionally, companies are using AI to optimize their own operations; one IT firm used AI to cut its data center cooling costs by about 40% through better temperature regulation.
"In conclusion, while concerns about AI’s rising energy demands are understandable, history and emerging evidence suggest that innovation and efficiency will likely keep consumption in check," said Adam Ross from Morgan Stanley Investment Management's 1GT Team. "Advances in hardware, smarter model design, and more effective cooling systems are already mitigating much of the projected strain."
"More importantly," Ross continued, "AI itself is proving to be a powerful tool for reducing energy use across sectors. Rather than overwhelming the grid, AI may ultimately become one of the most effective means of managing and reducing global energy demand."
He added: "Whatever occurs, we feel some caution is warranted when considering the energy investments behind the AI trend."
Risk considerations: alternative investments are speculative and involve a high degree of risk, including potential loss of capital and illiquidity. Investors should seek independent advice before making investment decisions.
Morgan Stanley Investment Management is part of Morgan Stanley’s asset management division.
