AI, energy, and the new rules of cloud sustainability competition | Computer Weekly
The cloud industry has long promoted a reassuring sustainability narrative: hyperscalers are more efficient, their use of renewable energy is growing, and so shifting workloads off-premises reduces emissions.
But how “green” is a specific cloud workload, in a specific region, at a specific time?
For most enterprise buyers, that comparison remains frustratingly hard. Amazon Web Services (AWS), Microsoft, and Google all publish sustainability data and customer-facing emissions tools, but they disclose different things, at different levels of granularity, using different methods. That makes direct comparisons difficult just as cloud becomes the default destination for AI workloads.
Similar metrics on the surface, different workload details
Hyperscalers’ public disclosures reveal a shared narrative. All three talk about energy efficiency, carbon-free or renewable electricity, customer carbon-accounting tools, water usage, and embodied carbon.
AWS says Amazon matched 100% of the electricity it consumed with renewable energy in 2024, reported a global power usage effectiveness (PUE) of 1.15 and a water usage effectiveness (WUE) of 0.15 litres per kilowatt-hour (L/kWh), and now offers customer emissions visibility by scope, region, and service through its Customer Carbon Footprint Tool.
Microsoft’s reporting for fiscal year ’25 shows a global PUE of 1.17 and a WUE of 0.27 L/kWh for the data centres it fully owns and controls, and also offers customer emissions tracking through its Emissions Impact Dashboard for Azure and Microsoft 365.
Google provides both market-based and location-based emissions data across all three scopes through its Google Cloud Carbon Footprint and publishes regional carbon-free energy indicators and grid carbon intensity data to help customers evaluate location choices.
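To see why these headline metrics do not translate directly into workload comparisons, consider the arithmetic. The sketch below applies the published facility-level PUE and WUE figures to a single hypothetical workload; the workload size is an assumption for illustration, not provider data, and real attribution would depend on region, time, and method.

```python
# Illustrative only: applies published facility-level PUE/WUE averages to a
# hypothetical workload. The workload size is an assumption, not provider data.

def workload_footprint(it_energy_kwh: float, pue: float, wue_l_per_kwh: float):
    """Estimate total facility energy and attributable water for a workload.

    PUE = total facility energy / IT equipment energy
    WUE = litres of water consumed / kWh of IT equipment energy
    """
    total_energy_kwh = it_energy_kwh * pue        # includes cooling and overhead
    water_litres = it_energy_kwh * wue_l_per_kwh  # facility-average water draw
    return total_energy_kwh, water_litres

# Global averages cited above (AWS 2024, Microsoft FY25):
providers = {
    "AWS":       {"pue": 1.15, "wue": 0.15},
    "Microsoft": {"pue": 1.17, "wue": 0.27},
}

it_energy = 10_000  # kWh -- hypothetical monthly IT energy for one workload
for name, m in providers.items():
    energy, water = workload_footprint(it_energy, m["pue"], m["wue"])
    print(f"{name}: {energy:,.0f} kWh total, {water:,.0f} L water")
```

Even this simple calculation shows the limits of the published numbers: a global-average PUE or WUE says nothing about the specific region a workload runs in, or whether that region is water-stressed.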
The industry, however, still lacks a standardised, apples‑to‑apples view of workload‑level sustainability across cloud providers. Native tools offered by cloud providers continue to improve, but they do not yet enable enterprises to easily compare the emissions profile of the same AI or cloud workload across Amazon Web Services, Microsoft Azure, and Google Cloud using a common framework.
Providers now give embodied carbon more attention, yet customers still rarely receive precise, comparable allocations of emissions from servers, GPUs, networking equipment, storage, and buildings.
Cloud providers also disclose water data unevenly. They may publish high‑level water usage effectiveness metrics but typically do not show the customer‑attributable water burden of workloads running in water‑stressed regions.
AI is making these gaps impossible to ignore. As AI receives unprecedented investments across industries, executive scrutiny around AI costs and outcomes is tightening, and the same scrutiny is emerging for environmental impact.
Microsoft says that as demand for AI and cloud services grows, it is redesigning data centres with direct-to-chip cooling and next-generation facilities that can avoid more than 125 million litres of water evaporation per facility each year.
AWS says it is building data centres for the next generation of AI workloads and highlights the efficiency benefits of its own silicon, including Graviton and Trainium.
Google’s 2025 Environmental Report is explicitly centred on energy, AI, and resilience, and its Carbon Footprint documentation now incorporates AI inference emissions for cloud AI services such as Vertex AI and Document AI.
What used to be a reputational issue is becoming an infrastructure accounting issue.
Community impact touted as improving, but not comparable either
There is a second, equally important dimension to this story: the community impact of data centres. AI data centres are deeply embedded in local contexts, drawing electricity from regional grids, occupying land in real communities, and often consuming water in areas already under stress.
Microsoft has increasingly linked its data centre expansion to community stewardship, citing initiatives such as the Quincy Water Reuse Utility, which it says reduced regional potable water use by 97% while supplying 1.5 million cubic metres of water annually for local drinking needs.
Amazon Web Services reports that its water replenishment programs are expected to return more than 18 billion litres per year to local communities, alongside renewable energy investments that support jobs and local economic development.
A 2025 investigation by The Guardian found that Amazon, Microsoft, and Google were operating or developing data centres in an expanding number of water‑scarce regions as AI and cloud demand rose. As a result, credible sustainability disclosures must now address not only carbon efficiency but also the local environmental and social consequences of digital infrastructure growth.
Sustainability transparency as a minimum purchasing standard
What should enterprise buyers do?
First, stop treating cloud sustainability as an afterthought, and start treating it as part of core architecture and sourcing governance.
That means requiring market-based and location-based emissions data, requiring reporting by service, region, project, and month, and demanding visibility into embodied carbon, AI-specific energy use, and water intensity.
Second, don’t rely exclusively on native dashboards. Third-party comparison tools exist because enterprises need normalisation across clouds, not just better single-provider views.
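The normalisation such tools perform can be sketched simply: map each provider's native emissions export into one common per-workload record, flagging gaps where a provider does not disclose a figure. All field names and sample values below are hypothetical; real exports differ in scope, method, and granularity.

```python
# Minimal sketch of cross-cloud normalisation: map each provider's native
# emissions report into one common record. All field names and figures here
# are hypothetical -- real provider exports differ in scope and method.

from dataclasses import dataclass

@dataclass
class WorkloadEmissions:
    provider: str
    region: str
    month: str
    kg_co2e_market: float    # market-based (reflects renewable purchases)
    kg_co2e_location: float  # location-based (reflects actual grid mix)

def normalise_aws(row: dict) -> WorkloadEmissions:
    # Hypothetical AWS-style export: tonnes, market-based only
    return WorkloadEmissions(
        provider="AWS", region=row["region"], month=row["period"],
        kg_co2e_market=row["mtco2e"] * 1000,     # tonnes -> kg
        kg_co2e_location=float("nan"),           # not disclosed: flag the gap
    )

def normalise_gcp(row: dict) -> WorkloadEmissions:
    # Hypothetical Google-style export: kilograms, both methods
    return WorkloadEmissions(
        provider="Google Cloud", region=row["location"], month=row["month"],
        kg_co2e_market=row["market_based_kgco2e"],
        kg_co2e_location=row["location_based_kgco2e"],
    )

records = [
    normalise_aws({"region": "us-east-1", "period": "2025-01", "mtco2e": 0.42}),
    normalise_gcp({"location": "europe-west1", "month": "2025-01",
                   "market_based_kgco2e": 95.0,
                   "location_based_kgco2e": 310.0}),
]
for r in records:
    print(r.provider, r.region, r.month, r.kg_co2e_market, r.kg_co2e_location)
```

The point of the sketch is the gap it exposes: where one provider reports only market-based figures, a common schema makes the missing location-based number visible rather than silently comparable.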
Finally, write these requirements directly into cloud and AI RFPs, rather than leaving them as optional side conversations after contracts are signed.
The cloud market is not short on climate ambition, glossy dashboards, or sustainability slogans. What it still lacks is anything buyers can reliably compare and use for procurement decisions.
AI has made cloud infrastructure core to enterprise architecture — more valuable, strategic, and resource-intensive. It has also made vague sustainability claims less defensible.
The next phase of competition among hyperscalers will not be won on model access, GPU supply, or price-performance alone. It will expand beyond raw technical advantage and pricing into sustainability transparency, infrastructure resilience, and credibility, and it will be shaped by which providers can show what their platforms cost the communities around them.
For the fast-moving enterprise buyer juggling a multitude of variables in scaling AI, sustainability transparency is quickly becoming a minimum standard.