ARIA Commodities

News and Views

The two-sided coin that is Generative AI

April 10 2024 | News and Views

Artificial intelligence (AI) is set to revolutionise the energy management of data centres, according to a flurry of articles from industry commentators. This advancement marks a significant leap towards enhancing efficiency and sustainability within one of the most critical components of modern infrastructure: the data centre and the grid that supplies it.

With the energy transition, data has never posed a more complex challenge for utilities. It is bigger, is generated faster, and is more dispersed than ever before. And yet, underneath all these complexities, data has also never held more opportunity for utilities. The key is to harness it. More accessible energy data can help utilities embrace change to optimise resources, manage disruptions, and enable renewables at scale. But for many utilities, that is easier said than done. Energy data often sits in IT, OT, and external silos, making it difficult for applications to leverage and for operators to access.

Grid data is what data scientists call “distributed” in nature. In other words, it is generated and stored all over utility organisations, with various business units, applications, and functions owning their own unique data sets. Data may also come from outside the organisation, as in the case of weather and wildfire data. During the 2010s, many utilities attempted to bridge these silos by “centralising” their grid data in data warehouses or data lakes. Many of these attempts struggled, simply because grid data is distributed by nature and not easily centralised. A side effect of these centralisation attempts was that data owners lost control of their data, and quality and accuracy suffered as a result. Critical grid processes such as grid orchestration, AI- and ML-powered technologies, and advanced grid control applications need high-quality data to work effectively. Thus, most grid data remains distributed – and, by extension, difficult to centralise.

But despite the challenges of identifying, collecting, and leveraging energy data, there is no questioning its crucial importance to the modern utility. Within this data lies the potential for utilities to unlock data-driven decision making, grid automation, and intelligent use cases. This means that utilities absolutely must find a solution that enables them to access and utilise any type of data exactly when they need it. The ideal solution – sketched in code after the list below – must be able to:

  • Bring together disparate data sources from across the enterprise to provide a unified view
  • Bridge data silos at any level
  • Respect the distributed nature of grid data (i.e. not try to centralise it)
  • Enable data-driven automation and coordinated decision making across transmission, distribution, and the edge to manage multi-directional energy flows.
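
As an illustration of what such a solution might look like, here is a minimal, hypothetical sketch of a federated access layer in Python: each data owner registers a fetch function for its own silo, and the layer queries every source in place at decision time rather than copying anything into a central store. The class and source names (GridDataFederation, scada, ami, weather) are invented for this example and do not refer to any particular product.

```python
# Minimal sketch of a federated access layer: data stays with its owners
# (IT, OT, external feeds) and is queried in place rather than copied into
# a central warehouse. All names and sources here are hypothetical.
from typing import Callable, Dict, List

# Each "source" is just a callable that returns records for a given feeder.
# In practice these would wrap an IT system, an OT historian, or a weather API.
SourceFn = Callable[[str], List[dict]]

class GridDataFederation:
    def __init__(self) -> None:
        self._sources: Dict[str, SourceFn] = {}

    def register(self, name: str, fetch: SourceFn) -> None:
        """Data owners register a fetch function; they keep control of the data."""
        self._sources[name] = fetch

    def unified_view(self, feeder_id: str) -> Dict[str, List[dict]]:
        """Query every registered source on demand and return one combined view."""
        return {name: fetch(feeder_id) for name, fetch in self._sources.items()}

# Hypothetical usage: three silos, queried in place at decision time.
federation = GridDataFederation()
federation.register("scada", lambda f: [{"feeder": f, "load_mw": 3.2}])
federation.register("ami", lambda f: [{"feeder": f, "meters_reporting": 1842}])
federation.register("weather", lambda f: [{"feeder": f, "temp_c": 31.5}])

print(federation.unified_view("FDR-101"))
```

The point of the design is the third bullet above: nothing is centralised, so data owners keep quality and accuracy in their own hands, while applications still get a single, unified view when they need it.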

One of the major advantages of using AI in energy management is its ability to enhance the efficiency of power grids within data centres. Traditional methods of managing energy consumption often involve static, manual adjustments that fail to account for the complex and fluctuating nature of data centre operations. AI, on the other hand, provides a more adaptive and proactive approach, allowing for continuous optimisation and real-time adjustments. This not only leads to significant cost savings but also contributes to a reduction in the carbon footprint of data centres.
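
To make that contrast concrete, the sketch below compares a static cooling setpoint with a simple adaptive rule that relaxes cooling when IT load is low. The proportional rule stands in for whatever learned model an AI system would actually use, and every figure in it (setpoints, rack capacity, the 22-27 C range) is an illustrative assumption rather than a value from the article.

```python
# Minimal sketch of the "static vs adaptive" contrast described above.
# The setpoints, thresholds, and the simple proportional rule are illustrative
# assumptions standing in for a learned control policy.

STATIC_SUPPLY_AIR_C = 22.0  # traditional fixed cooling setpoint

def adaptive_setpoint(it_load_kw: float, rack_capacity_kw: float) -> float:
    """Relax the supply-air setpoint when utilisation is low, so less energy
    is spent on cooling; tighten it as the racks approach full load."""
    utilisation = min(max(it_load_kw / rack_capacity_kw, 0.0), 1.0)
    # Interpolate between a relaxed 27 C at idle and 22 C at full load.
    return 27.0 - 5.0 * utilisation

# Example: at half load the adaptive rule relaxes cooling by 2.5 C
# relative to the static setpoint, which is where the savings come from.
for load_kw in (10.0, 30.0, 60.0):
    print(load_kw, "kW ->", round(adaptive_setpoint(load_kw, 60.0), 1), "C",
          "(static:", STATIC_SUPPLY_AIR_C, "C)")
```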

So whilst AI is likely to play a significant part in organising grid capacity more efficiently going forwards, it will inevitably lead to further complications in its own right.

However, to talk to utilities and data-centre operators is to hear contrary perspectives. Whilst many share the excitement about artificial intelligence (AI), they are grappling with an energy conundrum on which the future of three big economic shifts partly hinges: the AI revolution; the efforts to electrify swathes of the economy; and the fight against climate change.

All-Consuming Appetites

In a nutshell, “generative” AI, the sort behind OpenAI’s ChatGPT, has a ravenous appetite for electricity. It has landed, virtually out of the blue, on a global energy system that is already struggling to cope with alternative sources of power demand. As yet it is not clear whether there will be enough clean energy to meet everyone’s needs.

At first glance, the solution looks simple. Data centres, such as those that companies like Alphabet, Amazon and Microsoft use to supply cloud-computing services, have over the past decade or so accounted for only 1-2% of global electricity demand. For years the big-tech “hyperscalers” have harvested ever greater energy efficiencies from their server farms, even as the world’s computing workloads have soared. Moreover, they have invested heavily in clean energy to offset their carbon footprints. In America, electricity providers to the hyperscalers are only too keen to help. They have endured two decades of anaemic electricity demand and are desperate for new sources of growth. In recent earnings calls their bosses have promised tens of billions of dollars in investment over the next five years to pump more power to data centres. Last month one such firm, Talen Energy, sold Amazon a nuclear-powered data centre for $650m. So far, so promising.

Generative AI changes the nature of the game, though. Since the days when they were the workhorses of the cryptocurrency boom, graphics-processing units (GPUs), the chips on which models like ChatGPT are trained and run, have been energy addicts. According to Christopher Wellise of Equinix, which rents out data centres, a pre-AI hyperscale server rack uses 10-15 kilowatts (kW) of power.
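
To put that figure in perspective, a quick back-of-envelope calculation turns the quoted 10-15 kW per rack into annual energy. It assumes the rack runs continuously and ignores cooling and power-conversion overheads; both simplifications are ours, not the article's.

```python
# Back-of-envelope arithmetic on the 10-15 kW per pre-AI rack figure quoted
# above. Continuous operation is assumed; utilisation and cooling overhead
# are ignored, so these numbers cover the IT load alone.
HOURS_PER_YEAR = 24 * 365  # 8,760 h

for rack_kw in (10, 15):
    annual_mwh = rack_kw * HOURS_PER_YEAR / 1000  # kWh -> MWh
    print(f"{rack_kw} kW rack -> ~{annual_mwh:,.0f} MWh per year")
# 10 kW -> ~88 MWh/yr; 15 kW -> ~131 MWh/yr, before cooling and power losses.
```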

Game Changer

It is early days in the generative-AI boom, so it is too soon to make hard and fast predictions. But informed guesses about the related rise in energy demand are striking. At the top of its range, the International Energy Agency, a global forecaster, says that by 2026 data centres could use twice as much energy as two years ago—and as much as Japan consumes today.
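
As a rough sense-check of that upper-range scenario, the arithmetic below simply doubles an assumed 2022 baseline for global data-centre consumption; the baseline figure is an approximation introduced here for illustration, not a number taken from the article.

```python
# Rough illustration of the upper-range scenario quoted above: data-centre
# demand doubling between 2022 and 2026. The 2022 baseline is an assumed,
# approximate figure used only to show the scale of the doubling.
BASELINE_2022_TWH = 460                      # assumed global data-centre use, 2022
UPPER_RANGE_2026_TWH = 2 * BASELINE_2022_TWH

print(f"Upper-range 2026 estimate: ~{UPPER_RANGE_2026_TWH} TWh")
# A figure of this order is broadly comparable to Japan's annual electricity
# consumption, which is the comparison the article draws.
```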

In America, two things further compound the complexities. The first is timing. The rise of generative AI coincides with a booming economy, with power consumption to match. Many power consumers want their energy to be zero-carbon, creating competition for a scarce resource. So do buyers of electric vehicles (EVs), the rise of which may have slowed but has not stopped. The second is the challenge of expanding the grid. Despite support from the White House, it is not easy for utilities to build new renewable capacity quickly. They suffer from supply-chain problems; by some accounts it takes three years to deliver a transformer, up from less than a year previously.

If shortages of renewable energy occur, they will come at a cost. No one knows yet how generative AI will make money. What people do know is that the cost of acquiring GPUs is rocketing. If the energy costs of running them soar too, it could put the brakes on expansion. In addition, the electrification of the rest of the economy is highly cost-dependent; an AI v EV scramble for clean power would push up prices and serve neither industry well. The human ingenuity that created EVs, generative AI and the grid in the first place will be needed once more to allow AI to become part of the solution, rather than part of the problem.