With the world fixated on the latest artificial intelligence (AI) updates and new data centres springing up as quickly as companies can build them, there is a huge demand for power to run and cool the servers inside. Now concerns are mounting about whether the US (a global hub for data centres) can generate enough electricity for the widespread adoption of AI, and whether its ageing grid will be able to handle the load.
There are more than 8,000 data centres globally, with the highest concentration in the US, and courtesy of AI, there will be far more by the end of the decade. Boston Consulting Group estimates demand for data centres will rise 15-20% annually through 2030, when they’re expected to account for 16% of total US power consumption. That’s up from just 2.5% before OpenAI’s ChatGPT was released in 2022, and it’s equivalent to the power used by about two-thirds of all homes in the US.
“If we don’t start thinking about this power problem differently now, we’re never going to see this dream we have,” said Dipti Vachani, head of automotive at Arm. The chip company’s low-power processors have become increasingly popular with hyperscalers like Google, Microsoft, Oracle and Amazon, as they can reduce power use by up to 15% in data centres. Nvidia’s latest AI chip, Grace Blackwell, incorporates Arm-based CPUs it says can run generative AI models on 25 times less power than the previous generation.
One ChatGPT query uses nearly 10 times as much energy as a typical Google search, according to a report by Goldman Sachs. Generating an AI image can use as much power as charging a smartphone. This problem isn’t new. A 2019 study estimated that training one large language model produced as much CO2 as five gas-powered cars over their entire lifetimes.
The hyperscalers building data centres to accommodate this massive power draw are also seeing emissions soar. Google’s latest environmental report showed greenhouse gas emissions rose nearly 50% from 2019 to 2023, in part because of data centre energy consumption, although it also said its data centres are 1.8 times as energy efficient as a typical data centre. Microsoft’s emissions rose nearly 30% from 2020 to 2024, also due in part to data centres. In Kansas City, where Meta is building an AI-focused data centre, power needs are so high that plans to close a coal-fired power plant are being put on hold.
“The industry itself is looking for places where there is either proximate access to renewables, either wind or solar, and other infrastructure that can be leveraged, whether it be part of an incentive programme to convert what would have been a coal-fired plant into natural gas, or increasingly looking at ways in which to offtake power from nuclear facilities,” said Jeff Tench, Silicon Valley-based Vantage Data Centers’ executive vice-president of North America and APAC.
OpenAI CEO Sam Altman has been vocal about this need. He recently invested in a solar startup that makes shipping-container-sized modules with panels and power storage. Altman has also invested in nuclear fission startup Oklo, which aims to build mini nuclear reactors housed in A-frame structures, and in nuclear fusion startup Helion. Microsoft signed a deal with Helion last year to start buying its fusion electricity in 2028. Google partnered with a geothermal startup that says its next plant will harness enough power from underground to run a large data centre. Vantage recently built a 100-megawatt natural gas plant that powers one of its data centres in Virginia, keeping it entirely off the grid.
Yet another challenge is that generative AI data centres will also require 4.2-6.6bn cubic metres of water withdrawal by 2027 to stay cool, according to one research estimate. That’s more than half the UK’s total annual water withdrawal. In short, AI is an energy and water guzzler, and that does not augur well for the environment in the long run unless remedial measures are taken soon.