ESG Calculations
AI infrastructure demands are reshaping energy grids and ESG priorities. From nuclear deals to water stress, here's what the data shows—and what to measure.

There’s no question that AI is reshaping industries, and fast. From customer service to climate modelling, machine learning is becoming embedded in how we work. But as generative AI scales up, so does the infrastructure that powers it.


Cue the headlines about multi-gigawatt data centres, new power corridors, and unprecedented compute demand. Tech giants are striking large nuclear power deals and financing new reactors to secure long-term, zero-carbon electricity for AI data centres (Read Article: Meta expands nuclear power ambitions to include Bill Gates’ startup), and are even exploring early-stage concepts like space-based data centres, which could tap near-continuous solar energy and the cold of space for cooling (Read Article: Data Centers in Space Aren’t as Wild as They Sound).


Which raises the question: What are the ESG implications of AI at scale?


1. Emissions and electricity use


AI workloads are energy-intensive. Training a large model can require thousands of megawatt-hours of electricity. Inference (running those models live) adds even more demand, and it occurs at far greater scale, with impacts to match.

Estimates vary, but analysts suggest that by 2030 AI-related workloads could account for a low-single-digit share (roughly 3%) of global electricity use, and a much larger share of data-centre demand, depending on efficiency gains and adoption rates (Read Article: Energy demand from AI).

For organisations aiming to reduce emissions, this presents a clear tension:
• Are AI workloads being powered by renewable energy?
• How will energy-intensive AI use impact net-zero pathways?
• In total cost of ownership (TCO) calculations, does locating workloads in countries with lagging regulatory reporting create a financial incentive?
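
To make the first two questions concrete, here is a rough back-of-envelope sketch (in Python) that converts assumed training and inference energy into emissions under different grid mixes. All of the figures are illustrative placeholders rather than measured values.

```python
# Back-of-envelope: convert AI energy use into CO2-equivalent emissions.
# All figures below are illustrative assumptions, not measured values.

TRAINING_MWH = 5_000              # assumed energy to train one large model (MWh)
INFERENCE_MWH_PER_YEAR = 20_000   # assumed annual inference energy (MWh)

# Grid emission factors in tonnes CO2e per MWh (indicative; they vary by country and year)
GRID_FACTORS = {
    "coal_heavy_grid": 0.90,
    "average_grid": 0.45,
    "renewables_ppa": 0.03,       # residual emissions under a renewable PPA
}

def annual_emissions_tco2e(training_mwh, inference_mwh, factor, training_years=3):
    """Amortise training energy over its useful life, add inference energy, apply the grid factor."""
    annual_mwh = training_mwh / training_years + inference_mwh
    return annual_mwh * factor

for grid, factor in GRID_FACTORS.items():
    tco2e = annual_emissions_tco2e(TRAINING_MWH, INFERENCE_MWH_PER_YEAR, factor)
    print(f"{grid:>18}: {tco2e:,.0f} tCO2e per year")
```

The same workload can differ by more than an order of magnitude in emissions depending on how it is powered, which is why the renewable-sourcing question sits at the heart of any net-zero pathway.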


2. Water use for cooling
 

Data centres don’t just use electricity. Large hyperscale facilities can consume hundreds of thousands to millions of litres of water per day for cooling, depending on design and location (Read Article: Data Centers and Water Consumption).


If AI-driven growth pushes a shift back to water-cooled hyperscale designs, sustainability teams will need to evaluate:

• Local water stress and basin availability

• Trade-offs between energy and water efficiency

• Whether water withdrawals are being reported under frameworks like GRI or CDP
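
One simple way to frame the energy-versus-water trade-off is Water Usage Effectiveness (WUE): the litres of water consumed per kWh of IT energy. The sketch below uses placeholder numbers purely to show the shape of the calculation; real figures would come from site metering.

```python
# Water Usage Effectiveness (WUE) = site water consumed (L) / IT energy used (kWh).
# The numbers below are placeholders for illustration only.

def wue(litres_water_per_day: float, it_energy_kwh_per_day: float) -> float:
    """Litres of water consumed per kWh of IT energy."""
    return litres_water_per_day / it_energy_kwh_per_day

# Hypothetical hyperscale site: 1,000,000 L/day of cooling water,
# 40 MW of IT load running 24 hours = 960,000 kWh/day.
site_wue = wue(1_000_000, 40_000 * 24)
print(f"WUE: {site_wue:.2f} L/kWh")   # roughly 1.04 L/kWh in this example

# A dry-cooled design may use far less water but more electricity per unit of IT load,
# which is exactly the energy-versus-water trade-off sustainability teams need to weigh.
```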


3. Scope 3 and supply chain emissions
 

The embodied emissions of AI infrastructure, from GPUs to server racks, fall under Scope 3 and can be a major share of total lifecycle impact. The literature consistently notes that these upstream emissions are the hardest to measure and are often underreported. Challenges in this space include:


• The upstream impact of AI-capable hardware procurement

• Lifecycle emissions of AI deployments across the value chain

• Management of e-waste at the end of life of rapidly evolving hardware

As regulations (such as ASRS and CSRD) increase pressure on Scope 3, expect greater scrutiny of AI’s supply chain impacts.
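
A common approach to the hardware portion of Scope 3 is to take a per-unit embodied-carbon figure (from a manufacturer’s product carbon footprint or a lifecycle database) and amortise it over the equipment’s service life. The figures in the sketch below are assumptions for illustration, not published values.

```python
# Illustrative Scope 3 sketch: amortise the embodied emissions of AI hardware
# over its service life. The embodied-carbon figures are assumptions.

from dataclasses import dataclass

@dataclass
class Asset:
    name: str
    units: int
    embodied_kgco2e_per_unit: float  # from a product carbon footprint or LCA database
    lifetime_years: float

    def annual_embodied_tco2e(self) -> float:
        return self.units * self.embodied_kgco2e_per_unit / self.lifetime_years / 1_000

fleet = [
    Asset("GPU server (8-GPU node)", units=500, embodied_kgco2e_per_unit=3_000, lifetime_years=4),
    Asset("Storage rack", units=50, embodied_kgco2e_per_unit=5_000, lifetime_years=6),
]

total = sum(asset.annual_embodied_tco2e() for asset in fleet)
print(f"Annual amortised embodied emissions: {total:,.0f} tCO2e")
```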


4. Social implications and governance
 

Beyond its environmental footprint, AI at scale poses social risks, including water and land-use pressures on nearby communities. Other considerations include:


• Labour conditions in mining rare earths for AI hardware

• Bias and fairness in automated decision systems

• Data privacy, consent, and transparency


While these may sit outside traditional ESG reporting, they increasingly shape stakeholder trust.


5. Innovation on the efficiency frontier

 

Not all AI is energy-hungry. A growing number of innovations are focused on reducing the footprint of AI through smarter design and infrastructure:


Renewable-powered AI infrastructure: Companies like Firmus Technologies are pioneering AI factories in Australia powered by clean energy and designed for high energy and water efficiency, seeking out locations with renewables available here and now.

Efficient AI models: Emerging architectures are dramatically reducing energy usage per task. Some studies show up to 98% savings compared to large general-purpose models.

Hardware-software co-design: New chips and processing techniques are being built from the ground up to optimise for efficiency and lower emissions.

Optical and photonic computing: Future-forward solutions using light instead of electricity to process data — promising greater speed and significantly lower energy use.

Cloud migration: Shifting AI workloads to efficient cloud platforms can cut carbon emissions by up to 94%.

These innovations benefit providers by shrinking infrastructure footprint and lowering capital, energy, and cooling costs. But will those benefits, reinforced by sound sustainability practices such as reporting and net-zero targets, drive businesses and consumers to adopt more efficient AI? And what exactly is the balance between the impacts and benefits of any given model?
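
One way to think about that balance is energy, and therefore carbon, per task. The comparison below uses entirely hypothetical per-query figures to show the shape of the calculation rather than real measurements of any particular model.

```python
# Hypothetical per-query comparison of a large general-purpose model
# versus a smaller task-specific model. All figures are illustrative only.

GRID_FACTOR_KG_PER_KWH = 0.45        # assumed average grid intensity

models = {
    "large_general_model": 3.0,      # assumed Wh per query
    "small_task_model": 0.06,        # assumed Wh per query (about 98% less)
}

queries_per_year = 100_000_000

for name, wh_per_query in models.items():
    kwh = wh_per_query * queries_per_year / 1_000
    tco2e = kwh * GRID_FACTOR_KG_PER_KWH / 1_000
    print(f"{name:>20}: {kwh:,.0f} kWh/yr, {tco2e:,.1f} tCO2e/yr")
```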


6. What can providers do?


As organisations move to adopt AI, I think we will see consumers of AI push providers to ensure:

• Transparency on the impact of AI services: in essence, a product carbon footprint for each query, with a loading for the training of the models

• Transparent reporting on energy sourcing, emissions intensity, and water use (though with these figures being competitive advantages, open disclosure may be a challenge)

• Options for low-carbon workloads, such as time-shifting compute to green energy windows (not all queries need to happen instantly; a simple scheduling sketch follows this list)

• Alignment of new builds with high-efficiency and renewable-powered design standards

• Investment in efficiencies and training. I know some great coders, and I like good code; well-crafted queries can reduce compute loads significantly.
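
As a concrete illustration of the time-shifting idea above, here is a minimal carbon-aware scheduling sketch: given an hourly forecast of grid carbon intensity, deferrable batch workloads run in the cleanest available window. The forecast values are made up; a real implementation would pull them from a grid-data service.

```python
# Minimal carbon-aware scheduling sketch: run deferrable AI batch jobs in the
# lowest-carbon window of the day. The forecast values below are made up.

hourly_forecast_gco2_per_kwh = {
    0: 520, 3: 480, 6: 350, 9: 180,   # midday solar pushes intensity down
    12: 120, 15: 160, 18: 420, 21: 500,
}

def best_window(forecast: dict, deadline_hour: int) -> int:
    """Pick the hour with the lowest forecast carbon intensity before the deadline."""
    candidates = {hour: g for hour, g in forecast.items() if hour <= deadline_hour}
    return min(candidates, key=candidates.get)

start = best_window(hourly_forecast_gco2_per_kwh, deadline_hour=18)
print(f"Schedule deferrable training/batch inference at hour {start}:00")
```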


Final thought


AI offers enormous potential. But like any powerful technology, it comes with trade-offs. ESG teams and the providers building the future have a role in shaping responsible infrastructure that aligns innovation with long-term climate and social goals. Some are already doing this (Read Article: How much energy does Google’s AI use? We did the math).


Stay tuned for Part 2 of this post, where I will focus on what AI consumers and enterprise users need to consider when it comes to tracking and managing their own AI-related emissions.

 

Michael Kasteel  
Director - ESG & Industry Solutions