The EU will keep enforcing its tech regulations across the bloc, its digital chief Henna Virkkunen said Monday, despite threats from US President Donald Trump.
The rules protect rights including freedom of expression, she posted on X, adding: “I will keep enforcing them, for our kids, citizens and businesses.”
Brussels has already asserted its “sovereign right” to regulate the activities of tech giants wanting access to the European Union’s 450 million well-off consumers.
Its two main pieces of legislation—the Digital Services Act (DSA) and the Digital Markets Act (DMA)—aim to keep harmful content off the internet and ensure fair competition.
But Trump, who has shaken up global trade by imposing tariffs on America’s trading partners, has threatened to add levies on those he accuses of targeting US tech companies.
Virkkunen posted a link to a letter addressed to US Congress reiterating that the DSA and DMA were EU legislation with “no extraterritorial jurisdiction in the US or any other non-EU country.”
She countered claims that the EU rules amounted to “censorship”—made by the US State Department and detractors such as Meta chief Mark Zuckerberg—by stressing that the DSA upholds freedom of expression.
Its focus was to protect consumers, including against scams and fraud, “but also on defending our democracies against deliberate manipulation campaigns aimed at undermining free and fair elections.”
Virkkunen also objected to Congress inviting her predecessor in the previous European Commission, Thierry Breton, to appear before US lawmakers.
Citation:
EU vows to enforce tech rules, despite Trump pressure (2025, September 2)
retrieved 2 September 2025
from https://techxplore.com/news/2025-09-eu-vows-tech-trump-pressure.html
A battle has been raging in Sacramento over whether beverage cartons—the ones used for milk, juice, broth, wine, even egg whites—should get the coveted chasing arrows recycling label.
Earlier this year, the state agency in charge of recycling, CalRecycle, determined the cartons were probably not eligible, because they weren’t being sorted and recycled by the vast majority of the state’s waste haulers, a requirement of the state’s “Truth in Recycling” law, Senate Bill 343.
Three months later, the agency reversed course.
The label is critical for product and packaging companies to keep selling in California as the state’s single-use packaging law goes fully into effect. It calls for all single-use packaging products to be recyclable or compostable by 2032. If they’re not, they can’t be sold or distributed in the state.
According to internal agency emails, documents and industry news releases, the change was prompted by data from the carton packaging industry’s trade group, the Carton Council of North America. The council had also announced it was investing in a carton recycling facility in Lodi.
The waste agency’s reversal incensed several waste experts, anti-plastic activists and environmentalists, who say cartons have limited, if any, value or recycling potential. They say the new industry-backed facility in Lodi is nothing more than a facade—one of several similar operations that have failed across the country. CalRecycle’s revised determination about the recyclability of the material, they say, is based on flawed methods that are easy to exploit.
Some say it’s just the latest example of Gov. Gavin Newsom and CalRecycle retreating from the state’s landmark single-use plastic law, and other ambitious anti-waste and anti-plastic laws that he and the waste agency once touted.
“The big picture here is that the governor and CalRecycle are creating loopholes,” said Jan Dell, a chemical engineer and founder of Last Beach Cleanup, an anti-plastic organization.
“What we’ve got here is this Kingdom of California that wants to tell the world that ‘we’re the best in recycling, that recycling works, that we’re going to lead the way in recycling and build a circular economy.’ But, the reality on the ground is that this stuff’s not recyclable. It just isn’t.”
Yet others say what’s happened with carton material is exactly what the laws were designed to do: motivate plastic and packaging companies to make their packaging recyclable, or develop technologies and markets that will.
“We are gratified to see the Carton Council making these investments and demonstrating that recycling can work with a sincere commitment from industry,” said Sen. Ben Allen (D-Santa Monica), who authored both California’s truth in labeling and single-use plastic laws.
“For decades, Californians have been misled into believing that the tons of packaging we consume can be cleanly and effectively recycled if only we put it into the blue bin. Sadly, that is too often untrue.”
Melanie Turner, a CalRecycle spokeswoman, said the agency does not decide what products can get the recycling label; that is a decision made by the manufacturer. The agency’s role is to provide information to the manufacturer about the recyclability of the product in California.
The chasing arrows label has not only become increasingly important as the state’s single-use plastic law comes into effect, but it also provides comfort to consumers who are trying to minimize their environmental footprint.
Although at first glance most milk cartons appear to be primarily made of paper, they are actually composed of alternating layers of paper, plastic and sometimes aluminum—a laminated sandwich of materials that extends a product’s shelf life, but also makes it hard to recycle.
The material is a challenge for commercial and residential waste haulers, said Robert Reed, a spokesman for Recology, a large waste hauling company in the Bay Area, Northern California, Oregon and Washington.
Not only are there few buyers for the milk-sodden cartons themselves (data show they currently fetch $0 in the recycling market), they risk contaminating other valuable items. For example, if more than 2% of a bale of mixed paper contains cartons, the bale is considered worthless.
In 2024, more than 106,000 tons (about 212 million pounds) of the old milk, juice and broth containers were dumped in landfills.
According to the Carton Council of North America, there are five facilities in North America that take cartons and try to give them new life. Four of them, in Wisconsin, Alabama, Canada, and Mexico, say they can harvest the paper fibers out of the containers and resell them to tissue and toilet paper manufacturers. All are more than 2,000 miles from downtown Los Angeles.
The fifth, a facility based in Waterbury, Connecticut, chops the blended material up, heats it so the plastic layer melts and turns into an adhesive, then presses it between two layers of fire-resistant material to create a gypsum-like roofing material.
It’s not clear if any of these facilities are paying for used cartons from waste operators, or taking them for free. None of the companies that operate these facilities responded to requests from The Times.
The carton council has announced it is investing in two new facilities (including the one in Lodi) where soiled cartons will be turned into roofing material.
But similar operations have either failed in the past, or never materialized. In 2022, the nationwide garbage operator Waste Management invested in a carton-to-roofing-material facility in Des Moines, Iowa. Two years later, it shut down with no explanation. Similar facilities in Colorado and Pennsylvania that were touted in news releases never materialized.
Waste Management did not respond to requests for comment.
In February, a consortium called ReCB, made up of the carton council and two corporate partners, purchased the abandoned Des Moines plant. According to Jan Rayman, ReCB managing director, the facility has been running 24/7 since June.
The two other partners include Elof Hansson U.S., a global trading company, and the Upcycling Group, a construction material production company co-founded by Rayman.
“We don’t use any glues or chemicals during the process. We don’t use any water in our manufacturing process. So we basically borrow the properties of the carton, and convert this composite package into a high-performance composite-building material,” he said.
He said the facility in Iowa pays for used cartons, rather than accepting them for free, indicating they have some value, a key point for the industry in establishing recyclability. Yet regional data from RecyclingMarkets.net shows the material’s value in the Midwest at $0 since January. There is no indication in regional data going back to 2013 that anyone will pay for used cartons.
A showcase facility
The consortium’s Lodi facility is in a rented warehouse on the northern edge of the city, not yet operating. Rayman said it is waiting on permits from the city.
On a recent weekday afternoon, it contained two new, bright blue state-of-the-art processing lines imported from the Czech Republic. They’ll be used to chop, heat and press the cartons. On the floor nearby, a bale of old milk, juice and soup cartons was attracting flies.
According to the carton council, when the facility is fully operational, it will be able to process 9,000 tons of cartons per year—or about 8.4% of what currently goes to state landfills every year. Rayman said that’s just the beginning; it will scale up as demand for his roofing product increases.
But even if it does, which Dell and others doubt, considering the track record of past operations, it’s the way that CalRecycle granted the recycling label that she says is most problematic.
Under California law, CalRecycle is supposed to find out whether the state’s waste operators are sorting a material at waste facilities. If they’re doing so for less than 60% of the state’s population, the material isn’t eligible for a recycling label.
In April, CalRecycle determined that only 47% of the state’s population, across 16 counties, had access to facilities that accepted cartons for recycling and sorted them out of the waste stream.
The state considers people to have access if a single waste hauler in their county accepts a material for recycling.
In other words, according to CalRecycle’s methodology, if one of Los Angeles’ 17 mechanical recycling facilities separates out food and beverage cartons, the county’s entire 9.8 million population—or nearly 25% of the state’s population—is served.
“It’s like saying that because you have air conditioning in one of L.A.’s 1,000 or more schools, then all the schools are air-conditioned,” said Dell. “It doesn’t make sense.”
In fact, the state’s own Recycling and Disposal Reporting System shows that only one of the state’s 74 waste sorting operations sends carton bales off for recycling.
The state estimate of 47% meant the cartons were ineligible for the recycling label.
In the weeks that followed, however, the carton council provided the agency with new data, indicating that more than 70% of Californians, across 23 counties, have access. That higher percentage came in part from recycling operations that received new sorting machinery, called optical sorters, from the carton council.
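As a rough illustration of how that county-level counting works, the sketch below applies the rule with made-up county figures (not CalRecycle’s actual data): if any single facility in a county sorts cartons, the county’s whole population counts toward the access threshold.

```python
# Illustrative sketch of the county-level access rule described above: if a single
# facility in a county sorts a material, the county's entire population counts as
# having access. County figures here are made up for illustration only.

CA_POPULATION = 39_000_000  # approximate state population

counties = {
    # county: (population, does any facility there sort cartons?)
    "Los Angeles": (9_800_000, True),
    "San Diego": (3_300_000, False),
    "Orange": (3_200_000, True),
    "Riverside": (2_500_000, False),
}


def access_share(county_data: dict) -> float:
    covered = sum(pop for pop, sorts in county_data.values() if sorts)
    return covered / CA_POPULATION


share = access_share(counties)
print(f"{share:.0%} of Californians counted as having access")
print("eligible for the label" if share >= 0.60 else "not eligible (below 60%)")
```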
“The endorsement or promotion of false recycling labels drives up costs for consumers because it ultimately leads to more contamination in curbside bins,” said Susan Keefe, the Southern California director for Beyond Plastics, an anti-plastic group based in Bennington, Vermont.
“Granting an unearned, false recycling label to the carton packaging companies disrespects California taxpayers, who have seen their recycling costs continue to climb year after year due to contamination and false promises of recyclability.”
© 2025 Los Angeles Times. Distributed by Tribune Content Agency, LLC.
Citation:
Inside the fight over the recycling label on milk cartons (2025, September 2)
retrieved 2 September 2025
from https://techxplore.com/news/2025-09-recycling-cartons.html
With agentic artificial intelligence (AI), we could be facing the biggest tech refresh event in history, where every organisation might deploy up to 2,000 agents per employee.
Meeting that need will affect the entire IT infrastructure – and storage in particular.
Those are the views of Jeff Denworth, co-founder of Vast Data, who talks in this podcast about the challenges of agentic AI infrastructure for IT departments, the challenges to storage of agentic AI, and how customers can begin to meet those challenges across their datacentres and the cloud.
This includes being very careful to clearly specify and provision infrastructure while not over-buying, as well as ensuring storage and compute work hand in hand with application architectures and database teams.
What extra challenges does agentic AI pose for the IT infrastructure?
It’s a very broad question. But, to start, I think it’s important to point out that this is in some respects an entirely new form of business logic and a new form of computing.
And so, to start: agentic systems are reasoning models coupled with agents that perform tasks by leveraging those models, as well as different tools that have been allocated to them to help them accomplish their tasks … and these models need to run on very high-performance machinery.
Today’s AI infrastructure often runs best on GPUs [graphics processing units] and other types of AI accelerators. And so, the first question becomes, how do you prepare the compute infrastructure for this new form of computing?
And here, customers talk about deploying AI factories and RAG [retrieval augmented generation], and AI agent deployment tends to be the initial use case people think about as they start to deploy these AI factories.
These are tightly coupled systems that require fast networks that interconnect very, very fast AI processors and GPUs, and then connect them to different data repositories and storage resources that you might want to go and feed these agents with.
The interesting thing about agentic infrastructure is that agents can ultimately work across a number of different datasets, and even in different domains. You have kind of two types of agents – workers, and supervisory agents that coordinate them.
So, maybe I want to do something simple like develop a sales forecast for my product while reviewing all the customer conversations and the different databases or datasets that could inform my forecast.
Well, that would take me to having agents that work on and process a number of different independent datasets that may not even be in my datacentre. A great example is if you want something to go and process data in Salesforce, the supervisory agent may use an agent that has been deployed within Salesforce.com to go and sort out that part of the business system that it wants to process data on.
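As a rough sketch of that supervisor/worker split, the Python below shows a supervisor delegating sub-tasks to workers that are each bound to one data source; the connectors and summaries are hypothetical stand-ins, not any vendor’s actual API.

```python
# Minimal sketch of the supervisor/worker pattern described above.
# The connectors and the planning step are hypothetical stand-ins.

from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class WorkerAgent:
    """A worker bound to a single data source (a CRM, a warehouse, a file share)."""
    name: str
    fetch: Callable[[str], List[str]]       # pulls raw records for a sub-task
    summarise: Callable[[List[str]], str]   # condenses them for the supervisor


class SupervisorAgent:
    """Decomposes a goal, delegates sub-tasks to workers, and merges their answers."""

    def __init__(self, workers: Dict[str, WorkerAgent]):
        self.workers = workers

    def run(self, goal: str, plan: Dict[str, str]) -> str:
        # 'plan' maps worker name -> sub-task; a real system would derive this with
        # a reasoning model instead of taking it as an argument.
        partials = []
        for worker_name, sub_task in plan.items():
            worker = self.workers[worker_name]
            records = worker.fetch(sub_task)
            partials.append(f"{worker.name}: {worker.summarise(records)}")
        return f"Goal: {goal}\n" + "\n".join(partials)


# Hypothetical connectors standing in for, say, a Salesforce-hosted agent and an
# internal sales database; neither reflects a real API.
crm_worker = WorkerAgent(
    name="crm",
    fetch=lambda task: ["call note A", "call note B"],
    summarise=lambda recs: f"{len(recs)} customer conversations reviewed",
)
db_worker = WorkerAgent(
    name="sales_db",
    fetch=lambda task: ["Q1: 120 deals", "Q2: 140 deals"],
    summarise=lambda recs: "quarterly bookings trending up",
)

supervisor = SupervisorAgent({"crm": crm_worker, "sales_db": db_worker})
print(supervisor.run(
    goal="Draft a sales forecast",
    plan={"crm": "review recent customer conversations",
          "sales_db": "pull quarterly bookings"},
))
```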
So, the first question becomes, how do you define this pipeline? How do you scope out all of the various data sources that you may want to process on? How do you size for what you would think is kind of a nominal operational workload, so that you’ve got enough compute resources for the steady state?
And then, the compute discussion takes you down the path of datacentre and power infrastructure readiness, which is a whole different kettle of fish because some of these new systems – for example, Nvidia’s GB200 NVL72 systems – are very tightly coupled racks of GPUs that have very fast networks between them. These require something like 120kW per datacentre rack, which most customers don’t have.
And then you start working through the considerations of my GPU requirements and where can I deploy them? In a colo? Is it in a datacentre I have? Is that potentially hosted in some cloud or neo-cloud environment? Neo clouds are these new AI clouds born in the era of AI. There are so many different facets of decision-making that come into play when people think they want to start deploying agentic workloads.
What are the key challenges for storage infrastructure, in particular, in agentic AI?
Well, just as with the first question, it’s really multidimensional.
I think the first thing to size up is what is storage in agentic AI? And this is something that has radically changed since people started training AI models. Most people generally worked under the assumption that if you have a good and fast file system, that’s good enough. And so, the difference here is that when people are training in the AI sense, and even fine-tuning, often these are very well-curated datasets that get fed into AI machinery, and you wait a few hours or a few days, and out pops a new model.
And that’s the level of interaction you have with underlying storage systems, other than that storage system also needing to be able to capture intermittent checkpoints to make sure that if the cluster fails, you can recover from some point in time in a job and start over.
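To make that checkpoint pattern concrete, here is a minimal sketch of a training loop that writes a checkpoint every N steps and resumes from the latest one after a failure; the file layout, interval and JSON format are illustrative assumptions rather than any particular framework’s behaviour.

```python
# Minimal sketch of periodic training checkpoints: storage absorbs a burst of
# writes every CHECKPOINT_EVERY steps, and a failed job resumes from the last
# saved step. Paths, interval and format are illustrative assumptions.

import json
import os
from typing import Optional

CHECKPOINT_DIR = "checkpoints"
CHECKPOINT_EVERY = 100  # steps between checkpoints


def save_checkpoint(step: int, model_state: dict) -> None:
    os.makedirs(CHECKPOINT_DIR, exist_ok=True)
    path = os.path.join(CHECKPOINT_DIR, f"step_{step:08d}.json")
    with open(path, "w") as f:
        json.dump({"step": step, "model_state": model_state}, f)


def latest_checkpoint() -> Optional[dict]:
    if not os.path.isdir(CHECKPOINT_DIR):
        return None
    files = sorted(os.listdir(CHECKPOINT_DIR))
    if not files:
        return None
    with open(os.path.join(CHECKPOINT_DIR, files[-1])) as f:
        return json.load(f)


def train(total_steps: int) -> None:
    resumed = latest_checkpoint()
    start = resumed["step"] + 1 if resumed else 0
    model_state = resumed["model_state"] if resumed else {"update_count": 0}

    for step in range(start, total_steps):
        model_state["update_count"] += 1         # placeholder for a real training step
        if step > 0 and step % CHECKPOINT_EVERY == 0:
            save_checkpoint(step, model_state)   # the burst of I/O hits storage here


train(total_steps=500)
```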
If you think about agents, a user gets on a system and makes a prompt, and that prompt will then send the agent to do some sort of almost unpredictable level of computing, where the AI model will then go and look to work with different auxiliary datasets.
And it’s not just conventional storage, like file systems and object storage, that customers need. They also need databases. If you saw some of the announcements from Databricks, they’re talking about how AI systems are now creating more databases than humans are. And data warehouses are particularly important as AI agents look to reason across large-scale data warehouses.
So, anything that requires analytics requires a data warehouse. Anything that requires an understanding of unstructured data not only requires a file system or an object storage system, but also a vector database to help AI agents understand what’s in those file systems through a process called retrieval augmented generation.
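As a rough illustration of that retrieval step, the sketch below embeds a few documents, indexes them, and pulls back the best matches for an agent’s query; the toy bag-of-words embedding and in-memory index are stand-ins for a real embedding model and vector database.

```python
# Minimal sketch of retrieval over a vector index for RAG. A toy bag-of-words
# embedding and an in-memory index stand in for a real embedding model and
# vector database; only the retrieval step is shown.

import math
from collections import Counter
from typing import Dict, List, Tuple


def embed(text: str) -> Dict[str, float]:
    """Toy embedding: normalised word counts (a real system would use a model)."""
    counts = Counter(text.lower().split())
    norm = math.sqrt(sum(v * v for v in counts.values()))
    return {word: count / norm for word, count in counts.items()}


def cosine(a: Dict[str, float], b: Dict[str, float]) -> float:
    return sum(a[w] * b[w] for w in a.keys() & b.keys())


class VectorIndex:
    """In-memory stand-in for a vector database."""

    def __init__(self) -> None:
        self.entries: List[Tuple[str, Dict[str, float]]] = []

    def add(self, doc: str) -> None:
        self.entries.append((doc, embed(doc)))

    def search(self, query: str, k: int = 2) -> List[str]:
        q = embed(query)
        ranked = sorted(self.entries, key=lambda e: cosine(q, e[1]), reverse=True)
        return [doc for doc, _ in ranked[:k]]


index = VectorIndex()
for doc in [
    "Q2 sales review notes: enterprise pipeline grew 20 percent",
    "Datacentre power audit: racks limited to 15kW",
    "Customer call summary: renewal risk flagged for two accounts",
]:
    index.add(doc)

# The retrieved passages would be appended to the agent's prompt before generation.
print(index.search("what did customer conversations say about the sales pipeline"))
```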
The first thing that needs to be wrestled down is a reconciliation of this idea that there’s all sorts of different data sources, and all of them need to be modernised or ready for the AI computing that is about to hit these data sources.
I like to kind of look at what’s changed and what hasn’t changed in the market. And it’s true that there are all sorts of new applications being deployed that use reasoning agents and reasoning models as part of their business logic. But there’s also a lot of legacy applications that are now being up-levelled to also support this new type of AI computing.
And so, our general conclusion is that every single business application in the future will have some component of AI embedded into it. And there will be a whole bunch of new applications that also will be AI-centric that we haven’t planned for or don’t exist yet.
The common thread is that there’s this new style of computing that’s happening at the application level on a new type of processor that historically was not popular within the enterprise, which is a GPU or an AI processor. But I think the thing that people don’t realise is that the datasets they’ll be processing are largely historic data.
So, whereas the opportunity to modernise a datacentre is greenfield at the application level and at the processor or compute level, [there is] the brownfield opportunity to modernise the legacy data infrastructure that today holds the value and the information that these AI agents and reasoning models will look to process.
Then the question becomes, why would I modernise, and why is this important to me? That’s where scale comes back into the equation.
I think it’s important to checkpoint where we’re at with respect to agentic workflows and how that will impact the enterprise. It’s fair to say that pretty much anything that is routine or a process-bound approach to doing business will be automated as much as humanly possible.
There are now examples of many organisations that are not thinking about a few agents across the enterprise, but hundreds of thousands, and in certain cases, hundreds of millions of agents.
Nvidia, for example, made a very public statement that they’re going to be deploying 100 million agents over the next few years. And that would be at a time when their organisation will be 50,000 employees. Now, if I put these two statements together, what you have is roughly a 2,000 to one AI agent-to-employee ratio that you might think about planning for.
If this is true, a company of 10,000 employees would require large-scale supercomputing infrastructure just to process this level of agency. So, I think about it in terms of what the drivers are to modernise infrastructure. If just half or a fraction of this level of AI agent scale starts to hit a standard business, then every single legacy system that’s holding its data will be incapable of supporting the computational intensity that comes from this level of machinery.
And this is the thing that has us thinking we may be embarking on what could be the world’s largest technology refresh event in history. Probably the most recent one up until AI hit the market was virtualisation, which created new demands at the storage and database level. That same thing appears to be true for AI, as different customers we work with start to rethink data and storage infrastructure for large-scale agentic deployment.
How can customers ensure their infrastructure is up to the job for agentic AI?
It definitely requires some level of focus and understanding the customer workload.
But one of the things I see happening across the market is also over-rotation, where infrastructure practitioners will not necessarily understand the needs that come from either new business logic or AI research.
And so, they tend to overcompensate for the unknown. And that’s also pretty dangerous, because that creates a bad taste in the mouth for organisations that are starting to ramp into different AI initiatives when they realise, OK, we overbought here, we bought the wrong stuff here.
The first thing I would say is that there are best practices out in the market that should definitely be adhered to. Nvidia, for example, has done a really terrific job of helping articulate what customers need and sizing according to different GPU definitions, such that they can build infrastructure that’s general-purpose and optimised, but not necessarily over-architected.
The second thing that I would say is that hybrid cloud strategies definitely need to be reconciled, not only just for infrastructure-as-a-service – do I deploy stuff in my datacentre? do I deploy some stuff in different AI clouds or public clouds? – but also different SaaS [software-as-a-service] services.
The reason is that a lot of agentic work will happen there. You now have, for example, Slack, which has its own AI services in it. Pretty much any major SaaS offering also has an AI sub-component that includes some amount of agents. The best thing to do is sit down with the application architects team, which a lot of our storage customers don’t necessarily have a close connection to.
The second thing is to sit down with the database teams. Why? Because enterprise data warehouses need to be rethought and reimagined in this world of agentic computing, but also new types of databases are required in the form of vector databases. These have different requirements at the infrastructure and compute level, as well as at the storage level.
Finally, there needs to be some harmonisation around what will happen with the datacentre and across different clouds. You need to talk to the different vendors you work with – about that, and about how they help people through this in practice.
We’ve got something like 1.2 million GPUs that we’ve been powering around the world, and there’s all sorts of interesting approaches to not only sizing, but also future-proofing data systems by understanding how to continue to scale if different AI projects stick and prove to be successful.
by Simone Angster, DECHEMA Gesellschaft für Chemische Technik und Biotechnologie e.V.
The GreeN-H2-Namibia project has published three new reports that together provide key insights for Namibia’s emerging green economy. Covering topics from Power-to-X (PtX) technologies to regional water infrastructure, the reports address both technical and socio-economic challenges that decision-makers face in building a sustainable hydrogen sector.
The PtX report provides a comprehensive analysis of PtX, a key component of the green hydrogen economy, which converts renewable energy into storable fuels and chemicals. It details primary production pathways and evaluates their technical feasibility in Namibia’s context. Additionally, the report highlights potential applications, including transport fuels, industrial feedstocks, and energy storage, while assessing demand projections for the domestic market.
“Besides green ammonia and green steel, PtX includes the production of sustainable alternatives to petrochemicals, such as e-methanol or e-diesel,” states co-author Dr. Chokri Boumrifak. “However, these compounds require a carbon source that could be obtained from biogenic sources or hard-to-mitigate emissions, e.g. cement plants.”
Therefore, the authors of the report explore suitable carbon sources in Namibia. Extending Namibia’s green hydrogen derivatives beyond green ammonia could also unlock further market opportunities in the future, not only as PtX export commodities but also among suitable domestic industrial sectors as potential offtakers.
“Diesel is a widely used fuel in transportation, mining, agriculture, and fishing,” comments co-author Dr. Robin Ruff. “Additionally, ammonia is a precursor for fertilizers and explosives that could be used in agriculture and the mining sector.”
Although PtX products are expected to be cost-intensive in the short and medium term, cost reductions through continued development of these technologies could make PtX more feasible for Namibian industries.
A further report explores the potential of brine valorization in Namibia, particularly as desalination expands to support the country’s growing green hydrogen sector. While current regulations require environmental clearance for brine discharge, specific standards to guide sustainable desalination practices are missing. Market opportunities include sodium chloride, soda ash and sodium bicarbonate, with longer-term potential for recovering magnesium and lithium. A high-value, circular brine economy could be integrated with desalination and green energy hubs.
In parallel, the project has also released a report on water infrastructure in the Kharas Region, where Lüderitz and Aus are emerging as focal points for Namibia’s green hydrogen ambitions. The report compiles scattered data from diverse stakeholders into one coherent analysis, giving decision-makers a clearer basis for planning water infrastructure in the face of uncertainty.
This synthesis is particularly valuable for both Namibian and international stakeholders, who often lack a consolidated picture of local water constraints and investment needs in the context of green hydrogen development. Uncertainties range from whether hydrogen workers’ families will relocate, to what skills exist locally, the readiness of infrastructure, and how industrial development will actually unfold over time.
To address this, the report develops scenarios for future water demand built on transparent assumptions. The results underline the advantages of a modular approach to water infrastructure, which allows investments to grow with demand: meeting current and near-term needs without holding back other industries or urban growth, while keeping flexibility for larger green hydrogen projects.
By providing transparent assumptions and scenario-based pathways, the report also creates an accessible entry point for international financiers, development partners and private sector actors to understand where their support could make the most impact. The report calls for urgent solutions in Lüderitz and Aus and raises key questions that could shape not just local planning, but Namibia’s green hydrogen economy as a whole.
Together, these reports provide evidence-based guidance for policymakers, investors and communities both within Namibia and internationally, particularly for development agencies and decision-makers seeking to understand Namibia’s role in the global green hydrogen economy. They highlight both the opportunities and uncertainties of Namibia’s hydrogen transition, offering practical insights to ensure that industrial growth is matched with sustainable infrastructure and equitable development.
Provided by
DECHEMA Gesellschaft für Chemische Technik und Biotechnologie e.V.
Citation:
Three reports released supporting Namibia’s green hydrogen ambitions (2025, September 1)
retrieved 1 September 2025
from https://techxplore.com/news/2025-09-namibia-green-hydrogen-ambitions.html