
Data centers consume massive amounts of water. Companies rarely tell the public exactly how much



Credit: Unsplash/CC0 Public Domain

As demand for artificial intelligence technology boosts construction and proposed construction of data centers around the world, those computers require not just electricity and land, but also a significant amount of water. Data centers use water directly, with cooling water pumped through pipes in and around the computer equipment. They also use water indirectly, through the water required to produce the electricity to power the facility. The amount of water used to produce electricity increases dramatically when the source is fossil fuels compared with solar or wind.

A 2024 report from the Lawrence Berkeley National Laboratory estimated that in 2023, U.S. data centers consumed 17 billion gallons (64 billion liters) of water directly through cooling, and projects that by 2028, those figures could double—or even quadruple. The same report estimated that in 2023, U.S. data centers consumed an additional 211 billion gallons (800 billion liters) of water indirectly through the electricity that powers them. But that is just an estimate in a fast-changing industry.
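As a sanity check on those estimates, the unit conversions and the direct-to-indirect comparison can be reproduced in a few lines, using only the figures quoted above:

```python
# Back-of-envelope check of the Lawrence Berkeley National Laboratory
# figures quoted above (2023 estimates for U.S. data centers).
LITERS_PER_GALLON = 3.785

direct_gallons = 17e9     # direct cooling water
indirect_gallons = 211e9  # water embedded in electricity generation

direct_liters = direct_gallons * LITERS_PER_GALLON      # ~64 billion liters
indirect_liters = indirect_gallons * LITERS_PER_GALLON  # ~800 billion liters

# Indirect use dwarfs direct use by roughly a factor of 12.
ratio = indirect_gallons / direct_gallons
print(f"{direct_liters/1e9:.0f}B liters direct, "
      f"{indirect_liters/1e9:.0f}B liters indirect, ratio {ratio:.1f}x")
```

The roughly 12-to-1 ratio here is the same comparison the Berkeley Lab report draws between indirect and direct consumption.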

We are researchers in water law and policy based on the shores of Lake Michigan. Technology companies are eyeing the Great Lakes region to host data centers, including one proposed for Port Washington, Wisconsin, which could be one of the largest in the country. The Great Lakes region offers a relatively cool climate and an abundance of water, making the region an attractive location for hot and thirsty data centers.

The Great Lakes are an important binational resource that more than 40 million people depend on for their drinking water and that supports a US$6 trillion regional economy. Data centers compete with these existing uses and may deplete local groundwater aquifers.

Our analysis of public records and sustainability reports compiled by top data center companies has found that those companies don’t always reveal how much water their data centers use. In a forthcoming Rutgers Computer and Technology Law Journal article, we walk through our methods and findings using these resources to uncover the water demands of data centers.

In general, corporate sustainability reports offered the most access and detail—including that in 2024, one data center in Iowa consumed 1 billion gallons of water (3.8 billion liters), enough to supply all of Iowa’s residential water for five days.

How do data centers use water?

The servers and routers in data centers work hard and generate a lot of heat. To cool them down, data centers use large amounts of water—in some cases over 25% of local community water supplies. In 2023, Google reported consuming over 6 billion gallons of water (nearly 23 billion liters) to cool all its data centers.

In some data centers, the water is used up in the cooling process. In an evaporative cooling system, pumps push cold water through pipes in the data center. The cold water absorbs the heat produced by the data center servers, turning into steam that is vented out of the facility. This system requires a constant supply of cold water.

In closed-loop cooling systems, the cooling process is similar, but rather than venting steam to the air, air-cooled chillers cool down the hot water. The cooled water is then recirculated to cool the facility again. This does not require constant addition of large volumes of water, but it uses a lot more energy to run the chillers. The actual numbers showing those differences, which likely vary by the facility, are not publicly available.
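The scale of evaporative cooling's water draw follows from basic physics: every joule of server heat carried away by evaporation boils off a corresponding mass of water. A rough sketch of that energy balance (our illustration, not figures from the article; the facility size and latent-heat value are assumptions):

```python
# Rough energy-balance estimate of evaporative cooling water use.
# Assumption: essentially all IT heat is removed by evaporating water.
LATENT_HEAT_J_PER_KG = 2.26e6  # approximate latent heat of vaporization of water
LITERS_PER_KG = 1.0            # 1 kg of water is about 1 liter

def evaporative_water_liters(it_load_watts: float, hours: float) -> float:
    """Liters of water evaporated to remove it_load_watts of heat for `hours`."""
    heat_joules = it_load_watts * hours * 3600
    return heat_joules / LATENT_HEAT_J_PER_KG * LITERS_PER_KG

# Hypothetical 1 MW data hall running for one day:
liters = evaporative_water_liters(1e6, 24)
print(f"~{liters:,.0f} liters/day")
```

At roughly 1.6 liters per kilowatt-hour of IT load, this is broadly consistent with commonly cited rules of thumb for evaporative cooling, and it is the water draw that closed-loop designs trade away in exchange for the extra chiller energy described above.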

One key way to evaluate data center water use is to track the amount of water that is considered “consumed,” meaning it is withdrawn from the local water supply and used up, for instance evaporated as steam, and not returned to its source.

For this information, we first looked to government data, such as records kept by municipal water systems. But the process of obtaining all the necessary data can be onerous and time-consuming, with some agencies denying access due to confidentiality concerns. So we turned to other sources to uncover data center water use.

Sustainability reports provide insight

Many companies, especially those that prioritize sustainability, release publicly available reports about their environmental and sustainability practices, including water use. We focused on six top tech companies with data centers: Amazon, Google, Microsoft, Meta, Digital Realty and Equinix. Our findings revealed significant variability in both how much water the companies’ data centers used, and how much specific information the companies’ reports actually provided.

Sustainability reports offer a valuable glimpse into data center water use. But because the reports are voluntary, different companies report different statistics in ways that make them hard to combine or compare. Importantly, these disclosures do not consistently include the indirect water consumption from electricity use, which the Lawrence Berkeley Lab estimated was 12 times greater than the direct use for cooling in 2023. The specific water consumption figures we highlight all relate to cooling.

Amazon releases annual sustainability reports, but those documents do not disclose how much water the company uses. Microsoft provides data on its water demands for its overall operations, but does not break down water use for its data centers. Meta does that breakdown, but only in a companywide aggregate figure. Google provides individual figures for each data center.

The five companies we analyzed that do disclose water usage show a general trend of increasing direct water use each year, a trend researchers attribute to the growth of data centers.

A closer look at Google and Meta

To take a deeper look, we focused on Google and Meta, as they provide some of the most detailed reports of data center water use.

Data centers make up significant proportions of both companies’ water use. In 2023, Meta consumed 813 million gallons of water globally (3.1 billion liters)—95% of which, 776 million gallons (2.9 billion liters), was used by data centers.

For Google, the picture is similar, but with higher numbers. In 2023, Google operations worldwide consumed 6.4 billion gallons of water (24.2 billion liters), with 95% or 6.1 billion gallons (23.1 billion liters) used by data centers.

Google reports that in 2024, the company’s data center in Council Bluffs, Iowa, consumed 1 billion gallons of water (3.8 billion liters), the most of any of its data centers.

The Google data center using the least water that year was in Pflugerville, Texas, which consumed 10,000 gallons (38,000 liters), about as much as one Texas home would use in two months. That facility is air-cooled rather than water-cooled, and it consumed significantly less water even than another air-cooled Google data center, in Storey County, Nevada, which used 1.5 million gallons (5.7 million liters). Because Google’s disclosures do not pair water consumption data with the size of each center, the technology used or the indirect water consumption from power, these figures offer only partial views, with the big picture obscured.

Given society’s growing interest in AI, the industry will likely continue its rapid expansion. But without a consistent and transparent way to track water consumption over time, the public and government officials will be making decisions about locations, regulations and sustainability without complete information on how these massive companies’ hot and thirsty buildings will affect their communities and their environments.

Provided by
The Conversation


This article is republished from The Conversation under a Creative Commons license. Read the original article.

Citation:
Data centers consume massive amounts of water. Companies rarely tell the public exactly how much (2025, August 19)
retrieved 19 August 2025
from https://techxplore.com/news/2025-08-centers-consume-massive-amounts-companies.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no
part may be reproduced without the written permission. The content is provided for information purposes only.







Study examines whether policy intervention could combat ransomware



Credit: Pixabay/CC0 Public Domain

As ransomware attacks become more common and complex—and costly to the crimes’ targets—a University of Texas at Dallas researcher is examining how policymakers might combat cybercriminals.

Dr. Atanu Lahiri, an associate professor of information systems at the Naveen Jindal School of Management, said ransomware has become one of the top cybersecurity threats facing organizations worldwide. Spread primarily through email phishing scams and the exploitation of unpatched software bugs, ransomware blocks a user’s access to computer files until a ransom is paid.

“The data is still on your computer,” he said. “It’s locked up, and the criminals have the key.”

In a study published in Information Systems Research, Lahiri and a colleague examined whether and under what circumstances policy intervention could help deter this type of cyberattack. He found that effective response solutions might depend on factors such as the value of compromised information, the nature of the ransom demand, and who or what organization is most affected.

Although paying ransom often seems preferable to facing business disruptions, payments also embolden the attackers and encourage them to come back for more. This ripple effect, or externality, which is driven by extortion, creates a unique problem dubbed “extortionality” by the authors.

“There are two questions: When do we care, and what do we do?” Lahiri said. “Should ransom payments be banned or even penalized?”

The disruptions caused by ransomware attacks can be crippling for businesses. In 2024, the FBI’s Internet Crime Complaint Center received more than 3,000 ransomware complaints. Victims paid over $800 million to attackers, according to research by Chainalysis, although the impact is likely much higher because many incidents and payments go unreported.

The illegal breaches have hit targets ranging from Fortune 500 companies to police departments to government and university systems.

Lahiri was inspired to explore potential solutions as federal and state lawmakers grapple with laws to restrict government entities and other companies from paying ransoms to regain access to their data. He found that fighting these threats through legislation is tricky because a ban on ransom payments or other penalties could negatively affect the victim, whose goal is simply to recover compromised information quickly and with minimal disruption.

For example, outright bans on ransom payment are particularly problematic for hospitals, where lives are at stake and critical lifesaving information can’t be accessed.

On the other hand, paying ransom rewards criminal behavior, encourages more breaches and elevates the risk of additional attacks, the researchers found.

Through mathematical models and simulations, Lahiri determined that an ideal scenario in many cases would be for companies not to give in to an attacker’s ransom demand. In practice, however, this solution is not so clear-cut.

“It relies on you trusting the other guy, in this case other organizations, not to pay up either,” he said. “It would be better if nobody paid, but if someone does, it would raise the risk for everybody.”
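That collective-action dynamic can be illustrated with a toy expected-cost model (our simplification for intuition only, not the authors' actual microeconomic model; every parameter value here is made up):

```python
# Toy illustration of "extortionality": each ransom payment raises the
# attack risk for every firm. Parameters are illustrative, not from the study.
def expected_total_cost(n_firms, pay_fraction, ransom=1.0, disruption=3.0,
                        base_risk=0.1, risk_per_payer=0.02):
    payers = n_firms * pay_fraction
    # Payments embolden attackers, raising everyone's probability of attack.
    attack_risk = min(1.0, base_risk + risk_per_payer * payers)
    # A firm that pays bears the ransom; one that refuses bears the disruption.
    cost_if_attacked = pay_fraction * ransom + (1 - pay_fraction) * disruption
    return n_firms * attack_risk * cost_if_attacked

everyone_pays = expected_total_cost(20, 1.0)  # ~10: high shared risk, low per-attack cost
nobody_pays = expected_total_cost(20, 0.0)    # ~6: baseline risk, high per-attack cost
```

Even though the disruption cost (3.0) exceeds the ransom (1.0) for any individual firm, the collective cost is lower when nobody pays, because the shared attack risk stays at its baseline; this is the trust problem Lahiri describes.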

“You have to be careful when you impose a ban, though,” said Lahiri, who teaches the graduate class Cybersecurity Fundamentals at UT Dallas, serves as director of the cybersecurity systems certificate program, and chairs the University Information Security Advisory Committee. “A more reasoned approach might be to first try incentives or a penalty to deter ransom payments.”

If the attackers are not strategic in choosing their ransom asks—and do not demand different sums from the victims depending on their ability to pay—Lahiri recommends that policymakers impose fines or taxes on companies that pay ransoms.

“When imposing a ban, policymakers should be mindful,” he said. “In particular, hospitals and critical infrastructure firms should be exempted to avoid excessive collateral damage from business disruption.

“In some cases, you wouldn’t even have to impose the ban, but if you talk a lot about a ban, ransom payers would take notice. Even the specter of a ban might do the trick and make organizations invest in backup technologies that can help them recover without having to pay the attackers.”

The best offense, Lahiri said, is a good defense, and the best defense is simply more redundancy. Backing up data and practicing drills on recovering information is a strong way to avoid paying the attacker. Policymakers could incentivize redundancy measures, he said, by subsidizing backup technology, practice drills and awareness campaigns.

“One of the biggest problems is that people don’t invest in backups,” Lahiri said. “They don’t conduct drills, like fire drills. Security is always seen as a hassle.

“If we had great backups and we could recover from the attacks, we would not be paying the ransom in the first place. And we would not be talking about extortionality.”

Dr. Debabrata Dey, Davis Professor and area director of analytics, information and operations at the University of Kansas, is a co-author of the study.

More information:
Debabrata Dey et al, “Extortionality” in Ransomware Attacks: A Microeconomic Study of Extortion and Externality, Information Systems Research (2025). DOI: 10.1287/isre.2024.1160

Citation:
Study examines whether policy intervention could combat ransomware (2025, August 28)
retrieved 28 August 2025
from https://techxplore.com/news/2025-08-policy-intervention-combat-ransomware.html



Manufacturas Eliot boosts digital shift with Coats Digital’s VisionPLM




Coats Digital is pleased to announce that Manufacturas Eliot, one of Colombia’s leading fashion textile groups, has selected VisionPLM to advance its digital transformation strategy. The solution will optimise product lifecycle management across its portfolio of brands—Patprimo, Seven Seven, Ostu, and Atmos—enhancing collaboration, streamlining operations, and enabling greater speed to market.

Manufacturas Eliot, a Colombian fashion group, has selected Coats Digital’s VisionPLM to boost digital transformation across its brands.
The platform will enhance collaboration, speed up product development, and streamline operations.
VisionPLM aims to improve agility, traceability, and decision-making, supporting Eliot’s drive for innovation and sustainable growth.

Founded in 1957, Manufacturas Eliot is a vertically integrated manufacturer producing over 20 million garments annually. Renowned for delivering high-quality, accessible fashion, the group continues to invest in technologies that support sustainable growth and operational excellence.

The implementation of VisionPLM demonstrates Eliot’s strong commitment to end-to-end digitalisation across the value chain. By introducing VisionPLM, Eliot aims to improve product development agility, reduce time-to-market, and ensure seamless communication across cross-functional teams.

Juliana Pérez, Design Director, Seven Seven, commented: “From the design team’s point of view, we’re really excited about implementing VisionPLM, as it will allow us to manage our collections in a more structured way and collaborate efficiently with other departments.”

Angela Quevedo, Planning Director, Manufacturas Eliot, added: “VisionPLM will significantly improve the planning and coordination of our operations by enabling a more accurate flow of information and reducing response times across the supply chain. It will also help us optimise processes and accelerate decision-making.”

Tailored specifically for the fashion industry, VisionPLM integrates tools that boost development speed, improve traceability, and enhance decision-making. By centralising design, sourcing, and supplier collaboration in one digital platform, the solution enables a streamlined, transparent, and responsive approach to managing collections.

Oscar González, Coats Digital – LATAM, said: “We’re proud to continue supporting Manufacturas Eliot on its digital transformation journey. The adoption of VisionPLM marks a key milestone in advancing its fashion innovation strategy—enabling faster, smarter decision-making and more agile collaboration across teams and suppliers. It’s helping to build a future-ready, connected operation that’s fully aligned to the demands of today’s fashion market.”

Note: The headline, insights, and image of this press release may have been refined by the Fibre2Fashion staff; the rest of the content remains unchanged.

Fibre2Fashion News Desk (HU)





Top CDC Officials Resign After Director Is Pushed Out



Susan Monarez is no longer the director of the US Centers for Disease Control and Prevention, according to a post by the official Department of Health and Human Services X account. She had been in the position for just a month. In the wake of her apparent ouster, several other CDC leaders have resigned.

Named acting CDC director in January, Monarez was officially confirmed to the position by the Senate on July 29 and sworn in two days later. During her brief tenure, the CDC’s main campus in Atlanta was attacked by a gunman who blamed the Covid-19 vaccine for making him sick and depressed. A local police officer, David Rose, was killed by the suspect when responding to the shooting.

In a statement Wednesday evening Mark Zaid and Abbe David Lowell, Monarez’s lawyers, alleged that she had been “targeted” for refusing “to rubber-stamp unscientific, reckless directives and fire dedicated health experts.” The statement further says that Monarez has not resigned and does not plan to, and claims that she has not received notification that she’s been fired.

According to emails obtained by WIRED, at least three other senior CDC officials resigned Wednesday evening: Demetre Daskalakis, director of the National Center for Immunization and Respiratory Diseases; Debra Houry, chief medical officer and deputy director for program and science; and Daniel Jernigan, director of the National Center for Emerging and Zoonotic Infectious Diseases.

More resignations are expected to become public soon, say CDC employees with knowledge of the departures.

“I worry that political appointees will not make decisions on the science, but instead focus on supporting the administration’s agenda,” says one CDC employee, who was granted anonymity out of concerns over retribution. “I worry that the next directors will not support and protect staff.”

President Donald Trump’s original pick to lead the CDC was David Weldon, a physician and previous Republican congressman from Florida who had a history of making statements questioning the safety of vaccines. But hours before his Senate confirmation hearing in March, the White House withdrew Weldon’s nomination. The administration then nominated Monarez.

The CDC leadership exits come amid recent vaccine policy upheaval by HHS secretary Robert F. Kennedy Jr., who in May removed the Covid-19 vaccine from the CDC’s list of recommended vaccines for healthy children and pregnant women. The following month, he fired all 17 sitting members of the CDC’s Advisory Committee on Immunization Practices, a group of independent experts that makes science-based recommendations on vaccines.

In their place, he installed eight new members, including several longtime vaccine critics. “A clean sweep is necessary to reestablish public confidence in vaccine science,” Kennedy said in a statement at the time.

Earlier this month under Kennedy’s leadership, HHS canceled a half billion dollars in funding for research on mRNA vaccines. This month HHS also announced the reinstatement of the Task Force on Safer Childhood Vaccines, a federal advisory panel created by Congress in 1986 to improve vaccine safety and oversight for children in the US. The panel was disbanded in 1998, when it issued its final report. Public health experts worry that the panel is a move to further undermine established vaccine science.


