Tech
Auditing, classifying and building a data sovereignty strategy | Computer Weekly
Data sovereignty is a hot topic. For commercial and public sector organisations, ensuring personal data is secure and compliant is a primary objective. And that means it cannot be subject to foreign laws or interference.
Data sovereignty is also a matter for international relations, where states strive to ensure citizen and organisation data is secure from foreign interference. And, for states, achieving data sovereignty is also a way of protecting and developing national economies.
In this article, we look at data sovereignty, and the key steps CIOs need to take to build their data sovereignty strategy. This centres on auditing, classification and building controls over data location and movement.
What is data sovereignty, and why is it an issue?
At the most general level, data sovereignty is the retention of data within the jurisdiction – usually state boundaries – whose laws govern its use.
Interest in data sovereignty has been building for some time. In one sense, it looks a lot like law catching up with the “wild west” early years of cloud use and popularity. Here, organisations rushed to this new, highly flexible location to process and store data, then later discovered the risks to which they – and their customer data – had become exposed.
More recently, the drive to digital sovereignty stepped up to the level of states. That trend got a big boost during US president Donald Trump’s first term, which saw the introduction of the Clarifying Lawful Overseas Use of Data (Cloud) Act, a law that potentially allows US law enforcement to access data stored by US companies anywhere in the world. Alarm bells started ringing, especially in Europe.
Organisations achieve digital sovereignty in their operations by making data subject to the laws and control of the state they operate in, or from. But we are far from achieving that, when, for example, Amazon Web Services (AWS), Microsoft Azure and Google Cloud Platform (GCP) have around 70% of the European cloud market, and many European state organisations are completely or overwhelmingly dependent on US hyperscalers for cloud services.
What are the concerns about data sovereignty, and what do CIOs plan to do?
Surveys regularly find IT decision-makers are concerned about data sovereignty. A Gartner survey conducted among 241 IT decision-makers globally found the majority (75%) of those outside the US plan to have a digital sovereignty strategy in place by 2030. Meanwhile, 53% said concerns over geopolitics would restrict future use of global cloud providers, and 61% said such worries would increase their use of regional or local cloud providers.
Complexity – and the potential for contradictory regulations and increased costs – is also a major concern, says Simon Robinson, principal analyst for storage and data infrastructure at Omdia.
“Our research found 74% of organisations say sovereign clouds have become more important over the last two years,” he says.
“However, it is a complex and fast-moving area. The regulatory and compliance environment is evolving rapidly. But the challenge for global organisations is that some regulations may actually conflict, potentially forcing them to contemplate whether they might break one law or regulation to satisfy another.”
Robinson adds: “At the very least it pushes up costs, may lead to inconsistent data policies around retention, and could slow down the adoption of advanced technologies, such as AI [artificial intelligence].”
So, while risks around stored data being in datacentres in a foreign country, on foreign infrastructure and subject to that country’s laws are a major worry, resolving that situation can bring its own issues too.
What is a data sovereignty audit, and why is it so important?
Core to an organisation’s responses to an unknown or uncontrolled data sovereignty situation is an audit of its data. This is the first step towards ensuring data is kept and processed within the appropriate state boundaries.
That will likely take the form of identification of the risks around different classes of data, according to Jon Collins, vice-president of engagement and field chief technology officer at GigaOm.
“Not all data is created equal, and not all parts of the architecture are created equal,” he says. “The first step is to classify what you’ve got. Identify whether it needs to fall within the scope of sovereignty, understand what kind of data it is, and consider how it might be impacted in terms of privacy, localisation and compliance.”
Key parts of a digital sovereignty strategy include mapping digital assets and data flows throughout their lifecycle and the laws to which they are subject at all stages. Then classify the data to assess risk levels for each class.
This can include geo-tagging, and should be part of an ongoing process, says Bettina Tratz-Ryan, vice-president and analyst at Gartner. “Automated discovery tools help identify and tag sensitive data, whether in physical storage or incidental locations like shared drives and folders,” she adds.
“Regular audits and compliance checks are non-negotiable and require strong governance policies and periodic manual reviews.”
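The classification and tagging step described above can be sketched in code. The following is a minimal, hypothetical example, assuming simple pattern-based rules for flagging records that contain personal data; real automated discovery tools use far richer detection than this:

```python
import re

# Hypothetical pattern-based rules for flagging potentially sensitive values.
# Illustrative only; production discovery tools go well beyond regexes.
SENSITIVE_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "uk_postcode": re.compile(r"\b[A-Z]{1,2}\d[A-Z\d]?\s*\d[A-Z]{2}\b"),
}

def classify_record(record: dict) -> dict:
    """Tag a record with a sensitivity class and the rules that justify it."""
    hits = [
        name
        for name, pattern in SENSITIVE_PATTERNS.items()
        for value in record.values()
        if isinstance(value, str) and pattern.search(value)
    ]
    return {
        "record": record,
        "classification": "personal" if hits else "general",
        "matched": hits,
    }

tagged = classify_record({"name": "A. Customer", "contact": "a.customer@example.com"})
# The email address triggers the "personal" classification
```

The resulting tag travels with the data as metadata, so later controls over storage location and movement can act on it without re-inspecting the content.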
How to minimise exposure to data storage risks
A data storage strategy that addresses data sovereignty builds on the classification of data in the data audit to limit what data can go where.
As part of the classification process, data will be subject to a policy that manifests in metadata tagging that indicates its sensitivity and tolerance for movement.
“Organisations should adopt a data governance as code approach, automating compliance through infrastructure as code techniques for consistent enforcement and rapid remediation,” says Tratz-Ryan.
That means sensitive data should be stored locally or in regional datacentres to meet residency requirements, with the cloud used for scalability under strict, region-specific compliance requirements.
“Continuous monitoring, encryption and geo-fencing are essential, and governance must be built in, not bolted on,” adds Tratz-Ryan.
Such approaches address the difficulties that potentially arise with data in transit. With the ability to monitor compliance and auditability built in via classification and tagging, critical workloads can be more easily segregated from less sensitive data at rest and in transit.
“Strict governance over location and movement is the cornerstone of risk mitigation,” says Tratz-Ryan.
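The “data governance as code” idea can be illustrated with a short sketch. Here, a hypothetical policy maps each sensitivity class to the regions where that data may reside, and a check function flags violations. The policy table, region names and dataset names are all illustrative, not drawn from any particular tool:

```python
# Hypothetical residency policy: regions each sensitivity class may live in.
RESIDENCY_POLICY = {
    "personal": {"eu-west", "eu-central"},  # must stay within EU regions
    "general": {"eu-west", "eu-central", "us-east", "ap-south"},
}

def check_residency(datasets: list[dict]) -> list[dict]:
    """Return datasets stored outside the regions their classification allows."""
    return [
        d for d in datasets
        if d["region"] not in RESIDENCY_POLICY.get(d["classification"], set())
    ]

violations = check_residency([
    {"name": "crm_customers", "classification": "personal", "region": "us-east"},
    {"name": "public_docs", "classification": "general", "region": "us-east"},
])
# crm_customers is flagged: personal data outside the permitted EU regions
```

Run as part of an infrastructure-as-code pipeline, a check like this turns the residency policy into an automated gate rather than a manual review.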
Challenges in maintaining knowledge and control
There are many challenges to data sovereignty auditing. Data moves, and it moves across borders. We might believe we have nailed down data in our infrastructure, while data finds other backdoor routes across frontiers. Meanwhile, proprietary systems present huge challenges to audits and tagging, and staff create shadow IT, use emails, attach files, and so on.
In short, data movement in an organisation can be very complex indeed. It is potentially simple to audit and control the vast bulk of our data, but the problems come with incidental cases of data movement, says Tratz-Ryan.
“In globally connected organisations, sovereignty risks will occur even if data is stored in local servers. Remote access, backups, and software-as-a-service integrations can create cross-border exposure, triggering compliance challenges under laws like the US Cloud Act. Also, governance can be bypassed by incidental data movement via virtual private networks, personal devices, or email,” she says.
“And, for example, an automotive manufacturer may store design files on-premise in one location, but metadata and backups can flow through global product lifecycle management systems, creating sovereignty exposure.
“Incidental data movement, such as emails, shared drives and collaboration tools, often push data into unsanctioned cloud folders, outside sovereign governance. Shadow IT compounds the problem when employees use external apps without IT oversight, creating blind spots.”
GigaOm’s Collins believes that for most, the key elements needed to incorporate data sovereignty compliance are already present in their organisation.
“It’s practical to consider it within your broader governance, risk and compliance framework,” he says. “The advantage is, as a larger organisation, you already have practices, processes and people in place for audit, reporting and oversight. Sovereignty requirements can be incorporated into those mechanisms.”
Collins says we should not assume all data needs to meet sovereignty rules, and that in many cases, it’s not possible to do so.
“For example, it’s not realistic to make email a fully sovereign, locally contained application because it’s inherently distributed,” says Collins. “But you can prevent sovereign data from being transmitted by email. That’s where data loss prevention and data protection policies come in, to make sure data from certain repositories, or of certain classifications, is not emailed out.”
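Collins’ point about stopping sovereign data at the email boundary can be sketched as a simple policy check. This is illustrative only, and assumes classification tags have already been attached to attachments as metadata by an earlier audit step:

```python
# Classifications that must never leave via email, per a hypothetical DLP policy.
BLOCKED_CLASSIFICATIONS = {"personal", "sovereign"}

def may_send(attachments: list[dict]) -> tuple[bool, list[str]]:
    """Allow an outbound message only if no attachment carries a blocked tag."""
    blocked = [
        a["filename"] for a in attachments
        if a.get("classification") in BLOCKED_CLASSIFICATIONS
    ]
    return (not blocked, blocked)

ok, blocked = may_send([
    {"filename": "brochure.pdf", "classification": "general"},
    {"filename": "customers.csv", "classification": "personal"},
])
# ok is False; customers.csv is held back by the policy
```

The point is that email itself stays global and distributed; only the tagged sovereign data is prevented from travelling over it.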
It is similar with cloud. Rather than trying to make all cloud folders sovereign, we should instead decide what data can and cannot be stored there. And if data needs to be stored locally, it goes to a local on-premise system, or a domestic cloud service or availability zone.
“The core debate is deciding whether a particular dataset is sovereign,” says Collins. “If you operate in a given country and you hold customer data about people in that country, then that data stays in that country. That gives you a clear list of what cannot go into cloud folders, be sent by email, or managed by a system that can’t guarantee localisation. Once you frame it that way, the whole thing becomes much more straightforward.”
The Man Behind AlphaGo Thinks AI Is Taking the Wrong Path
David Silver gave the world its very first glimpse of superintelligence.
In 2016, an AI program he developed at Google DeepMind, AlphaGo, taught itself to play the famously difficult game of Go with a kind of mastery that went far beyond mimicry.
Silver has since founded his own company, Ineffable Intelligence, which aims to build more general forms of AI superintelligence. The company will do this, Silver says, by focusing on reinforcement learning, which involves AI models learning new capabilities through trial and error. The vision is to create “superlearners” that go beyond human intelligence in many domains.
This approach stands in contrast to how most AI companies plan to build superintelligence, by exploiting the coding and research capabilities of large language models.
Silver, speaking to WIRED from his office in London, says he thinks this approach will fail. As amazing as LLMs are, they learn from human intelligence—rather than building their own.
“Human data is like a kind of fossil fuel that has provided an amazing shortcut,” Silver says. “You can think of systems that learn for themselves as a renewable fuel—something that can just learn and learn and learn forever, without limit,” he says.
I’ve met Silver a few times and—despite this proclamation—he’s always struck me as one of the more humble people in AI. Sometimes, when talking about ideas he considers silly, he flashes a puckish grin. Right now, though, he’s deadly serious.
“I think of our mission as making first contact with superintelligence,” he says. “By superintelligence I really mean something incredible. It should discover new forms of science or technology or government or economics for itself.”
Five years ago, such a mission might have seemed ridiculous. But tech CEOs now routinely talk about machines outpacing human intelligence and replacing entire categories of workers. The idea that some new technical twist might unlock superhuman AI capabilities has recently spawned a raft of billion-dollar startups.
Ineffable Intelligence has so far raised $1.1 billion in seed funding at a valuation of $5.1 billion—an enormous sum by European AI standards. Silver has also recruited top AI researchers from Google DeepMind and other frontier labs to join his endeavor.
Silver says he will give all of the money he makes from equity in Ineffable Intelligence—a sum that could amount to billions if he is successful—away to charity.
“It’s a huge responsibility to build a company focusing on superintelligence,” he tells me. “I think this is something that has to be done for the benefit of humanity, and any money that I make from Ineffable will go to high-impact charities that save as many lives as possible.”
Total Focus
Silver met Demis Hassabis, the CEO of Google DeepMind, at a chess tournament when they were kids, and the pair later became lifelong friends and collaborators.
They remained close after Silver left Google DeepMind, which he did only because he wanted to chart a completely new path. “I feel it’s really important that there is an elite AI lab that actually focuses a hundred percent on this approach,” he says. “That it’s not just a corner of another place dedicated to LLMs.”
The limits of the LLM-based approach can be seen, Silver says, with a simple thought experiment. Imagine going back in time and releasing a large language model in an era when everyone believed the Earth was flat. Without being able to interact with the real world, the system, he says, would remain an avid flat-earther, even if it continued to improve its own code.
An AI system that can learn about the world for itself, however, could make its own scientific discoveries.
The Best iPhone Charger for Late-Night Doomscrolling
The best iPhone charger depends on several factors. Are you topping off your battery on the go? Do you want to charge your iPhone as quickly as possible? Are you charging it overnight on your nightstand? The best gear recommendation is going to change with the situation. Luckily, the WIRED Reviews team tests iPhone chargers in the field all year long. Not a day goes by without at least one of us assessing an iPhone charger. I’ve gathered up our favorite picks for every scenario.
Be sure to check out our related buying guides, like the Best Power Banks, the Best 3-in-1 Chargers, and the Best Wireless Chargers.
Table of Contents
The Best iPhone Chargers
Best Wall Charger for iPhone
This Anker charger is slick and has folding prongs so it’s easy to travel with, but the best part is that it can charge your phone at 40 watts (average is 20 to 27 watts). That means you can get up to 50 percent battery life in only 20 minutes. Not all iPhone models support charging this fast—it’s limited to iPhone 17, iPhone 17 Pro, and iPhone 17 Pro Max—but you may as well future-proof your gear if you’re shopping for a wall charger, even if your phone can’t take full advantage of those speeds yet.
Best Power Bank for iPhone
We do recommend the Anker Laptop Power Bank as our top pick overall, but if you’re only trying to top off your iPhone, this Nimble model is a very reliable and neat-looking alternative. It’s svelte, smaller than a deck of cards, and can deliver 20 watts to two devices at once. Nimble also makes a slightly larger version, which has a larger capacity and can charge at up to 65 watts. Aside from the cool design featuring speckled colors and a lanyard loop, Nimble also uses bioplastics, recycled materials, and minimal packaging. A USB-C charger is included in the box.
Best MagSafe Portable Charger for iPhone
This 10,000-mAh power bank can charge your device at up to 15 watts, but it’ll also charge older devices at a slower rate. It has a built-in kickstand and an LED display that lets you know how much power is left at a glance. It works in portrait or landscape modes. Be aware that it won’t be able to charge most phones fully more than once, but it’s hard to beat if you’re seeking wireless charging on the go. If you want a bigger capacity or faster charging, you don’t want MagSafe.
Best 3-in-1 Charger for iPhone
The Belkin 3-in-1 can charge your compatible iPhone at 15 watts, plus your AirPods and your Apple Watch at the same time. The charging pad can be tilted to your preferred angle, including in landscape orientation if you want to watch a video or put your phone in StandBy mode. The USB-C cable is permanently attached, which you may or may not like. Check our best 3-in-1 chargers buying guide for additional picks.
Best 2-in-1 Charger for iPhone
I love a 3-in-1 charger as much as the next tech nerd, but sometimes they’re overkill. My Apple Watch battery usually lasts all day long, but I can chew through my older AirPods battery before my lunch break hits, and my iPhone battery might be depleted too, depending on whether or not I’m streaming Max Velocity off to the side. This 2-in-1 charger has been my steadfast desktop companion. Mophie makes another version that tops off your Apple Watch and iPhone instead of your headphones, which might be what you want if you’re rocking wired headphones or you’re making intense use of a walking pad throughout the day. There’s a 40-watt wall charger in the box—a rarity these days!—plus a USB-C cable that winds neatly into the base. It’s easy to adjust the angle of your iPhone as well, and I’ve found the base very sturdy. If you want to charge, but not necessarily all of the possible devices simultaneously, these might be what you seek.
Best USB-C Cable for iPhone
This braided nylon USB-C cable has a durable exterior made from recycled plastic. The cable is rugged, with Anker promising that it can operate in temperatures ranging from negative 40 degrees to 176 degrees Fahrenheit. It’s backed by a lifetime warranty. It’s got a built-in cable management loop. It’s more than enough cable for your iPhone. Read our guide to the Best USB-C Cables for more picks.
Best Lightning Cable for iPhone
If your iPhone is still rocking the Lightning cable, this is gonna be way better than whatever shoddy cable Apple sent you. It’s durable and is Made for iPhone-certified, so you won’t have any problems getting it to work. It comes in 3-, 6-, or 10-foot lengths with a two-year warranty. Best of all, the exterior casing will stay intact, unlike what you’d probably get with Apple’s cables.
DSIT gets sums badly wrong on AI datacentre carbon footprint | Computer Weekly
Government figures for the projected carbon footprint of datacentres have been miscalculated. New figures for the likely carbon output resulting from electricity use by datacentres, published last week, have been revised upwards by around 100x at both the minimum and maximum ends of the projected range.
Last July, the Department for Science, Innovation & Technology (DSIT) published its Compute evidence annex, which set out the future for AI, compute demand and implications for carbon footprint. DSIT has since unpublished the original report, but it remains accessible online.
The document said: “We estimate that by 2035, the UK’s greenhouse gas emissions from AI compute could range from 0.025 to 0.142 MtCO₂ [millions of tonnes of CO₂] – this is below 0.05% of the UK’s projected total emissions.”
But in a correction to that document, the DSIT said last week: “The UK’s cumulative 10-year greenhouse gas emissions from AI compute could range from 34 to 123 MtCO₂ – this is around 0.9-3.4% of the UK’s projected total emissions over the 10-year period.”
The figures were miscalculated to a staggering degree. The earlier numbers appear to have been annual and the recent revision a 10-year figure, which implies the underlying estimate has increased by around 100x.
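The scale of the revision can be checked with simple arithmetic. Treating the earlier figures as annual and the new ones as cumulative 10-year totals, the implied correction factors come out at roughly 87x and 136x, i.e. around 100x:

```python
# Earlier DSIT figures, apparently annual (MtCO2)
old_low, old_high = 0.025, 0.142

# Revised figures, cumulative over 10 years (MtCO2)
new_low, new_high = 34.0, 123.0

# Compare like with like: scale the old annual figures up to 10 years,
# then see how many times larger the revised estimates are
factor_low = new_low / (old_low * 10)     # 34 / 0.25  = 136x
factor_high = new_high / (old_high * 10)  # 123 / 1.42 ≈ 87x
```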
Meanwhile, analysis by climate change science and policy research group Carbon Brief suggests even those figures might be optimistic. Core to that assessment is the government’s aim of a grid carbon intensity of 50gCO2/kWh by 2030. That figure is what can be achieved with “clean” sources of energy, such as wind, nuclear, hydro and solar.
But figures from last month – researched by Carbon Brief and published with environmental campaigners Foxglove – suggest that is a wildly optimistic estimate if any of that electricity needs to be generated from gas, as gas-fired generation comes with a carbon intensity of around 10x that of clean sources.
Carbon Brief has calculated that emissions could in fact be somewhere between 3.4 MtCO₂ using 5% gas, and 68.1 MtCO₂ if electricity was 95% gas-generated. The higher figure – not far off the annual carbon emissions of Sweden – comes from an estimate based on a recent Ofgem projection of 20GW of future datacentre electricity demand. The same document illustrated the scale of demand by reference to actual peak demand in February 2026 of 45GW.
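Carbon Brief’s range can be approximated with a back-of-envelope calculation. The sketch below assumes the projected 20GW of datacentre demand runs continuously, and takes a gas-generation carbon intensity of roughly 410gCO2/kWh; both figures are illustrative assumptions rather than numbers quoted in the article, but they land close to the published 3.4 and 68.1 MtCO₂:

```python
GW_DEMAND = 20           # Ofgem projection of future datacentre demand
HOURS_PER_YEAR = 8760
GAS_INTENSITY_G_PER_KWH = 410  # illustrative; typical of gas-fired generation

def gas_emissions_mt(gas_share: float) -> float:
    """Annual MtCO2 from the gas-generated share of datacentre electricity."""
    kwh = GW_DEMAND * 1e6 * HOURS_PER_YEAR * gas_share
    return kwh * GAS_INTENSITY_G_PER_KWH / 1e12  # grams -> megatonnes

low = gas_emissions_mt(0.05)   # ~3.6 MtCO2, close to Carbon Brief's 3.4
high = gas_emissions_mt(0.95)  # ~68 MtCO2, close to Carbon Brief's 68.1
```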
Ofgem’s 20GW is a projection based on National Energy System Operator research that asked customers about future grid connection requirements.
Foxglove head of strategy Tim Squirrell said: “The government has a legally binding commitment to reach net zero by 2050. This already sat awkwardly alongside its hell-for-leather embrace of a hyperscale AI datacentre buildout, which unchecked could double the electricity consumption of the entire country.
“The situation has now been revealed to be much, much worse, given the fact the government doesn’t seem to have done even the most basic arithmetic needed to measure the potential new carbon emissions of these datacentres. The government urgently needs to confront the reality that it can’t rubber stamp hundreds of new datacentres, whilst keeping its manifesto promise to the country – and legal obligation – to combat the climate crisis.”
Computer Weekly has calculated that there is currently around 1.6GW of datacentre capacity in the UK, with just over 8GW currently in planning or under construction.