Tech
Water efficiency of English datacentres scrutinised in TechUK report | Computer Weekly

A report into commercial datacentres’ water usage in England suggests the sector is more efficient and less water-intensive than previously thought, thanks to advances in cooling technologies.
The survey, carried out by UK tech trade body TechUK in collaboration with the Environment Agency, set out to assess the environmental resources consumed by the datacentre industry in England, with a particular focus on water use.
TechUK gathered data from 73 sites across England, including more than 50 in the Water Resources South East region, and its findings showed that modern cooling systems are less reliant on potable water to keep servers from overheating than previously assumed.
According to the results, 51% of surveyed sites use waterless cooling systems that require no additional water beyond the standard use of a commercial building. Out of those facilities that do use water, most employ hybrid systems combining air, water and refrigerant-based heat rejection, with only 5% relying entirely on water-based cooling.
These figures are significant because the datacentre industry has often been criticised for a lack of transparency around its environmental footprint. In fact, when compared with broader industrial consumption, datacentres account for only a small fraction of water use. The report notes that 64% of sites consumed volumes of water similar to those of a Premier League football club over the course of a year.
One key conclusion is that datacentres have steadily become more water-efficient, largely due to technological innovation. Methods such as liquid cooling and direct-to-chip cooling are reducing or eliminating reliance on potable water. This trend is especially important as the UK government pushes for rapid expansion of datacentre capacity to meet the growing demands of AI-driven computing.
Luisa Cardani, head of the Datacentres Programme at TechUK, said further innovation in cooling is likely to continue. “A lot of the datacentre operators for newer facilities chose to move away from any water use where possible, and move to waterless cooling or hybrid systems,” she said. “That trend has continued because, as more and more data has become available around where there is water scarcity in England, they need to be efficient with their resources.”
The report also makes recommendations for government and industry, including the development of standardised but flexible cooling requirements for AI-ready servers. It calls for early coordination between datacentre developers, local authorities and water suppliers to ensure water demand is aligned with local supply capacity through clear connection agreements.
“Water companies would have this data. So, the question here is whether regulation is necessary,” Cardani added. “As our survey shows, a lot of these companies actually measure how much water they use, which itself is a very good thing, of course. As part of our recommendations, we call for all of the sector to do this.”
Richard Thompson, deputy director for water resources at the Environment Agency, said the report demonstrates that “UK datacentres are utilising a range of cooling technologies and becoming more water conscious”, adding: “It is vital the sector puts sustainability at its heart, and minimises water use in line with evolving standards. We are working with industry and other regulators to raise these to secure the best outcomes for our environment and our water supply for future generations.”
Despite its positive outlook, the report acknowledges its own limitations. The sample size of 73 sites represents only a fraction of the UK’s 477 datacentres, with all data provided voluntarily and without external validation. Most participating sites were located in Greater London and the South East, and the study focused only on large commercial facilities, excluding smaller operators.
According to Peter Judge, senior research analyst at Uptime Intelligence, this lack of transparency is no surprise. “Datacentre operators don’t really naturally give up information,” he said. “They’re operating in a world where they’re focused on their clients. Their clients expect a sort of level of privacy and so forth. Their default position is to not give information unless they absolutely have to. So, I think it will be forced upon them by legislation, rather than them doing it willingly.”
Judge argues that disclosure could ultimately benefit datacentre operators, particularly if they are classified as critical national infrastructure. “A lot of banking services and health services depend critically on datacentres, but you can’t say all datacentres are critical to the functioning of the country, some of them are simply storing personal videos.
“In other words, when legislation happens, it automatically has to demand information from the providers for there to be a benefit to being classified as critical national infrastructure, which might mean that you get exemptions from some of the energy efficiency or water usage demands.”
Uptime has previously criticised the sector for being overly secretive. “Datacentre operators have generally been too complacent, too secretive and when asked about environmental impact, they have been much too inclined to issue little lectures about how datacentres are really important, so we should all stop worrying,” Judge said.
He added that operators should engage more proactively with policymakers: “One of the things that Uptime is talking to operators about is the need to engage proactively with the people that are setting the legislation to try and make sure that the legislation is made with an actual understanding of how the sector works.”
Judge also warned that efficiency gains must be viewed in the context of rapid industry growth. “The industry likes to concentrate on efficiency rather than totals, but totals is how people set policies at the national level,” he said.
“If a big cloud provider improves the efficiency of its datacentres by 10%, but it has expanded the total capacity it’s using 10-fold in that time, it’s basically using 10 times the power, just with a little bit more efficiency.”
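To put that arithmetic in concrete terms, the back-of-the-envelope sketch below uses purely illustrative numbers, not figures from any real operator.

```python
# Back-of-the-envelope illustration of Judge's point: a 10% efficiency gain
# is swamped by a 10-fold expansion in capacity. All numbers are invented
# for illustration only.

baseline_power_mw = 100.0   # hypothetical fleet power draw today
efficiency_gain = 0.10      # 10% less power per unit of capacity
capacity_growth = 10.0      # capacity expands 10-fold

new_power_mw = baseline_power_mw * capacity_growth * (1 - efficiency_gain)
print(f"Old total: {baseline_power_mw:.0f} MW")
print(f"New total: {new_power_mw:.0f} MW ({new_power_mw / baseline_power_mw:.1f}x the original)")
# Old total: 100 MW
# New total: 900 MW (9.0x the original)
```

On those assumed figures, the fleet still draws roughly nine times the power it did before the efficiency gain, which is the gap between per-unit efficiency and national totals that Judge describes.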
The government has already announced significant investment in expanding datacentre capacity across the UK by 2030.
Tech
What the US$55 billion Electronic Arts takeover means for video game workers and the industry

Electronic Arts (EA), one of the world’s largest gaming companies, has agreed to be acquired for US$55 billion in the second-largest buyout in the video game industry’s history.
Under the terms, Saudi Arabia’s sovereign wealth fund (a state-owned investment fund), along with private equity firms Silver Lake and Affinity Partners, will pay EA shareholders US$210 per share.
EA is known for making popular gaming titles such as Madden NFL, The Sims and Mass Effect. The deal, US$20 billion of which is debt-financed, will take the company private.
The acquisition reinforces consolidation trends across the creative sector, mirroring similar deals in music, film and television. Creative and cultural industries have a “tendency for bigness,” and this is certainly a big deal.
It marks a continuation of large game companies being consumed by even larger players, such as Microsoft’s acquisition of Activision Blizzard in 2023.
Bad news for workers
There is growing consensus that this acquisition is likely to be bad news for game workers, who have already seen tens of thousands of layoffs in recent years.
This leveraged buyout will result in restructuring at EA-owned studios. It adds massive debt that will need servicing. That will likely mean canceled titles, closed studios and lost jobs.
In their book “Private Equity at Work: When Wall Street Manages Main Street,” researchers Eileen Appelbaum and Rosemary Batt point to the “moral hazard” created when equity partners saddle portfolio companies with debt but carry little direct financial risk themselves.
The Saudi Public Investment Fund (PIF) is looking to increase its holdings in lucrative sectors of the game industry as part of its diversification strategy. However, private equity firms subscribe to a “buy to sell” model, focusing on making significant returns in the short term.
Appelbaum notes that restructuring opportunities are more limited when larger, successful companies—like EA—are acquired. In such cases, she says, “financial engineering is more common,” often resulting in “layoffs or downsizing to increase cash flow and service debt.”
Financial engineering combines techniques from applied mathematics, computer science and economic theory to create new and complex financial tools. The failed risk management of these tools has been implicated in financial scandals and market crashes.
Financialization and the fissured workplace
The financialization of the game industry is a problem. Financialization refers to a set of changes in corporate ownership and governance—including the deregulation of financial markets—that have increased the influence of financial companies and investors.
It has produced economies where a considerable share of profits comes from financial transactions rather than the production and provision of goods and services.
It creates what American management professor David Weil calls a “fissured workplace” where ownership models are multi-layered and complex.
It gives financial players an influential seat at the corporate decision-making table and directs managerial attention toward investment returns while transferring the risks of failure to the portfolio company.
As a result, game titles, jobs and studios can be easily shed when financial companies restructure to increase dividends, leaving workers with little access to these financial players as accountable employers.
Chasing incentives and cutting costs
The Saudi PIF has stated a goal of creating 1.8 million “direct and indirect jobs” to stimulate the Saudi economy. But capital is mobile, and game companies will likely follow jurisdictions that have lower wages, fewer labor protections and significant tax incentives.
Some Canadian governments are working to keep studios and creative jobs closer to home. British Columbia recently increased its interactive media tax credit to 25%.
The move was welcomed by the chief operations officer of EA Vancouver, who said “B.C.’s continued commitment to the interactive digital media sector … through enhancements to the … tax credit … reflects the province’s recognition of the industry’s value and enables companies like ours to continue contributing to B.C.’s creative and innovative economy.”
This may buffer Vancouver’s flagship EA Sports studio, but those making less lucrative games or in regions without financial subsidies will be more at risk of closure, relocation or sale. Alberta-based BioWare—developer of games including Dragon Age and Mass Effect—could be at risk.
Other ways of aggressively cutting costs might come in the form of increased AI use. EA was called out in 2023 for saying AI regulation could negatively impact its business. Yet creative stagnation and cutting corners through AI will negatively impact the number of jobs, the quality of jobs and the quality of games. That could be a larger threat to EA’s business and reinforce a negative direction for the industry.
Game players have low tolerance for quality shifts and predatory monetization strategies. Research shows that gamers see acquisitions negatively: development takes longer, innovation is curtailed and creativity is stymied.
Consolidation among industry giants may cause players to lose faith in EA’s product—and games in general, given the many other entertainment options that are available.
Creative control and worker power at risk
Some have raised concerns that the acquisition could affect EA’s creative direction and editorial decisions, potentially leading to increased content restrictions.
While it’s still unclear how the deal will influence EA’s output, experiences in other industries might be a sign of things to come. For instance, comedians reportedly censored themselves to perform in Saudi Arabia.
The acquisition may also have a chilling effect on the workers’ unionization movement. Currently, no EA studios in Canada are unionized. Outsourced quality assurance workers at the EA-owned BioWare Studio in Edmonton successfully certified a union in 2022, but were subsequently laid off. Fears of outsourcing, layoffs and restructuring could discourage future organizing efforts.
On the other hand, the knowledge that large financial players are making massive profits could galvanize workers, especially considering that before the buyout, EA CEO Andrew Wilson was paid about 264 times the salary of the median EA employee.
The deal certainly does nothing to bring stability to an already volatile industry. Regardless of any cash injection, EA remains very exposed.
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Tech
An Amazon outage has rattled the internet. A computer scientist explains why the ‘cloud’ needs to change

The world’s largest cloud computing platform, Amazon Web Services (AWS), has experienced a major outage that has impacted thousands of organizations, including banks, financial software platforms such as Xero, and social media platforms such as Snapchat.
The outage began at roughly 6pm AEDT on Monday. It was caused by a malfunction at one of AWS’ data centers located in Northern Virginia in the United States. AWS says it has fixed the underlying issue but some internet users are still reporting service disruptions.
This incident highlights the vulnerabilities of relying so much on cloud computing—or “the cloud” as it’s often called. But there are ways to mitigate some of the risks.
Renting IT infrastructure
Cloud computing is the on-demand delivery of diverse IT resources such as computing power, database storage, and applications over the internet. In simple terms, it’s renting (not owning) your own IT infrastructure.
Cloud computing rose to prominence with the dotcom boom of the late 1990s, when digital tech companies started delivering software over the internet. As companies such as Amazon matured in their ability to offer what’s known as “software as a service” over the web, they also began renting out their virtual servers to others for a fee.
This was a lucrative value proposition. Cloud computing enables a pay-as-you-go model similar to a utility bill, rather than the huge upfront investment required to purchase, operate and manage your own data center.
As a result, the latest statistics suggest more than 94% of all enterprises use cloud-based services in some form.
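To make the rent-versus-own comparison concrete, here is a toy cost model; every figure is invented for illustration and bears no relation to real AWS, Azure or Google pricing.

```python
# Toy comparison of the two cost models described above: a large upfront
# data-center purchase vs. pay-as-you-go cloud rental. All figures are
# hypothetical and chosen only to show the shape of the trade-off.

UPFRONT_DATACENTER_COST = 2_000_000   # hypothetical capital outlay
ANNUAL_OPERATING_COST = 150_000       # hypothetical staffing, power, maintenance
CLOUD_MONTHLY_BILL = 30_000           # hypothetical pay-as-you-go spend

def own_cost(years: int) -> float:
    """Cumulative cost of buying and running your own data center."""
    return UPFRONT_DATACENTER_COST + ANNUAL_OPERATING_COST * years

def rent_cost(years: int) -> float:
    """Cumulative cost of renting equivalent capacity from a cloud provider."""
    return CLOUD_MONTHLY_BILL * 12 * years

for years in (1, 3, 5, 10):
    print(f"{years:>2} yr   own: ${own_cost(years):>9,.0f}   rent: ${rent_cost(years):>9,.0f}")
```

Under these assumed numbers, renting avoids the large upfront outlay even though the cumulative bills converge over time, which is the essence of the pay-as-you-go pitch.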
A market dominated by three companies
The global cloud market is dominated by three companies. AWS holds the largest share (roughly 30%). It’s followed by Microsoft Azure (about 20%) and Google Cloud Platform (about 13%).
All three service providers have had recent outages, significantly impacting digital service platforms. For example, in 2024, an issue with third-party software severely impacted Microsoft Azure, causing extensive operational failures for businesses globally.
Google Cloud Platform also experienced a major outage this year due to an internal misconfiguration.
Profound risks
The heavy reliance of the global internet on just a few major providers—AWS, Azure, and Google Cloud—creates profound risks for both businesses and everyday users.
First, this concentration forms a single point of failure. As seen in the latest AWS event, a simple configuration error in one central system can trigger a domino effect that instantly paralyzes vast segments of the internet.
Second, these providers often impose vendor lock-in. Companies find it prohibitively difficult and expensive to switch platforms due to complex data architectures and excessively high fees charged for moving large volumes of data out of the cloud (data egress costs). This effectively traps customers, leaving them hostage to a single vendor’s terms.
Finally, the dominance of US-based cloud service providers introduces geopolitical and regulatory risks. Data stored in these massive systems is subject to US laws and government demands, which can complicate compliance with international data sovereignty regulations such as Australia’s Privacy Act.
Furthermore, these companies hold the power to censor or restrict access to services, giving them control over how firms operate.
The current best practice for mitigating these risks is to adopt a multi-cloud approach and decentralize. This involves running critical applications across multiple vendors to eliminate the single point of failure.
This approach can be complemented by what’s known as “edge computing”, wherein data storage and processing are moved away from large, central data centers toward smaller, distributed nodes (such as local servers) that firms can control directly.
The combination of edge computing and a multi-cloud approach enhances resilience, improves speed, and helps companies meet strict data regulatory requirements while avoiding dependence on any single entity.
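As a rough illustration of that combination, the sketch below tries a hypothetical primary cloud endpoint first, then a second provider, then a local edge node. The endpoint names and health-check paths are invented; a real deployment would rely on each provider’s SDK, DNS-level failover or a service mesh rather than hand-rolled checks.

```python
# Minimal sketch of multi-cloud / edge failover: try each endpoint in order
# of preference and route traffic to the first one that answers its health
# check. Endpoints below are hypothetical placeholders.

import urllib.request
import urllib.error

# Ordered by preference: primary cloud, secondary cloud, local edge node.
ENDPOINTS = [
    "https://api.primary-cloud.example.com/health",
    "https://api.secondary-cloud.example.com/health",
    "http://edge-node.internal.example:8080/health",
]

def first_healthy_endpoint(endpoints=ENDPOINTS, timeout=2.0):
    """Return the first endpoint that answers its health check, or None."""
    for url in endpoints:
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                if resp.status == 200:
                    return url
        except (urllib.error.URLError, OSError):
            continue  # provider outage or network error: try the next one
    return None

if __name__ == "__main__":
    target = first_healthy_endpoint()
    print(f"Routing traffic via: {target or 'no provider reachable'}")
```

The design choice is simply that no single provider sits in the critical path: if the primary region fails, traffic falls through to another vendor or to infrastructure the firm controls itself.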
As the old saying goes, don’t put all of your eggs in one basket.
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Tech
Sperm From Older Men Have More Genetic Mutations

Human sperm not only accumulates genetic mutations with age; the proportion of sperm carrying potentially serious mutations also rises, and with it the risk of disease in offspring.
This is according to a new study by researchers at the Sanger Institute and King’s College London. The team sequenced semen samples from individuals between the ages of 24 and 75, using very high-precision technologies, and found that the male germ line (the line of cells that produce sperm) is subject to a combination of mutation and positive selection.
The scientists used a duplex sequencing technique called NanoSeq, which can detect rare mutations with a very low margin of error. This allowed them to analyze 81 sperm samples from 57 donors. The results showed that a man’s sperm adds an average of 1.67 new mutations every year.
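Taken at face value, that rate lends itself to a simple back-of-the-envelope calculation. The naive linear sketch below is illustrative only, since the study itself shows that selection effects complicate the picture.

```python
# Naive linear extrapolation of the reported average of 1.67 new sperm
# mutations per year of paternal age. Illustrative only: the study also
# describes positive selection, so real accumulation is not this simple.

MUTATIONS_PER_YEAR = 1.67

def extra_mutations(age_younger: int, age_older: int) -> float:
    """Expected additional mutations accumulated between two paternal ages."""
    return MUTATIONS_PER_YEAR * (age_older - age_younger)

print(f"From age 30 to 70: ~{extra_mutations(30, 70):.0f} additional mutations")
# From age 30 to 70: ~67 additional mutations
```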
But the most striking finding is not the mere accumulation of mutations with age. The authors discovered that the male germ line is subject to positive selection: certain mutations give the sperm-producing cells that carry them a competitive advantage, allowing those cells to expand. Many of these mutations lie in genes linked to developmental disorders or a predisposition to childhood cancer.
“We expected to find evidence that selection influences mutations in sperm,” said Matthew Neville, coauthor of the study published this month in the journal Nature. “What surprised us was how much the number of sperm carrying mutations associated with serious diseases increases.”
What Does This Mean for Children of Older Fathers?
The researchers estimated that about 3 to 5 percent of sperm from middle-aged and older men carry some potentially pathogenic mutation in the exome (the coding part of the genome). That represents a higher risk than previous estimates. In more concrete numbers, the estimated fraction for men in their thirties was close to 2 percent, while it reached about 4.5 percent for men in their seventies.
From the evolutionary and clinical perspective, the implications are significant. Evolutionarily, it shows that the male germ line is not simply a “machine” that accumulates errors: There is a dynamic process of mutation and selection that can modify the genetic “quality” of the sperm with the age of the father.
On the clinical side, however, it raises questions about reproductive planning, genetic counseling, and the additional risks associated with an older father. The authors argue that although the percentages remain modest, the accumulation is not only linear but also has a selection component that favors mutations with the potential to spread.