Heightened global risk pushes interest in data sovereignty | Computer Weekly
Heightened risk related to data sovereignty is universally acknowledged. Most IT decision makers see that risk increasing as a result of geopolitical instability, and that inadequate preparation could result in costly reputational damage and a loss of customer trust.
Those are the key findings of a Pure Storage-sponsored survey in which the University of Technology Sydney carried out interview-based qualitative research among IT practitioners in the Europe and Asia-Pacific regions.
The survey found:
- 100% of those asked believed sovereignty risks that include potential service disruption have forced organisations to reconsider where data is located;
- 92% said geopolitical shifts had increased sovereignty risks;
- 92% believed inadequate sovereignty planning could lead to reputational damage;
- 85% identified loss of customer trust as the key consequence of inaction;
- 78% said they had embraced data strategies that included engaging with multiple service providers, adopting sovereign datacentres (on-premises or in-country), and building enhanced governance requirements into commercial agreements.
The survey commentary talks of a “perfect storm” where service disruption risks, foreign influence and evolving regulations converge to create huge exposure to risk for organisations that could result in revenue loss, regulatory penalties and irreparable damage to stakeholder trust if not addressed.
One IT decision maker talked about how complex data sovereignty can be to unpick, and how it now forms key planks of their organisation’s agreements with customers.
“The Access Group handles sensitive end user data for our customers across the world, from the NHS in the UK to the Tax Department in Australia,” said Rolf Krolke, regional technology director for APAC with The Access Group. “Data sovereignty is an absolutely critical issue for us and our customers. In fact, they ask that it be written into our contracts.”
The concept of data sovereignty centres on the idea that information created, processed, converted and stored in digital form is subject to the laws of the country in which it was generated. But data can travel, too, and when it does, it is the destination country’s laws on data held there that must be adhered to. That is known as data residency.
Difficulties can arise when the two concepts meet and the laws of one state contradict another, such as with the European Union’s General Data Protection Regulation, which requires that data transferred to another jurisdiction is held with adequate safeguards and protections.
For such reasons, organisations often want to know where their data goes, and also might want to keep it in known – often home country – locations.
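For illustration, the short sketch below shows how such a residency constraint might be enforced in code before data is copied to another region. The policy table, dataset names and region codes are hypothetical; this is a minimal sketch of the idea, not any provider's actual API or compliance tooling.

```python
# Minimal sketch of a data-residency check before replicating or transferring data.
# The policy table, dataset names and region codes are hypothetical examples,
# not any specific provider's API or a real compliance framework.

RESIDENCY_POLICY = {
    "nhs_patient_records": {"allowed_regions": {"uk-south", "uk-west"}},
    "au_tax_filings": {"allowed_regions": {"au-east"}},
    "marketing_assets": {"allowed_regions": {"eu-central", "uk-south", "us-east"}},
}

def transfer_allowed(dataset: str, destination_region: str) -> bool:
    """Return True only if the destination keeps the dataset inside its permitted regions."""
    policy = RESIDENCY_POLICY.get(dataset)
    if policy is None:
        # No explicit policy: fail closed rather than risk a non-compliant transfer.
        return False
    return destination_region in policy["allowed_regions"]

if __name__ == "__main__":
    for dataset, region in [("nhs_patient_records", "us-east"),
                            ("au_tax_filings", "au-east")]:
        verdict = "permitted" if transfer_allowed(dataset, region) else "blocked"
        print(f"{dataset} -> {region}: {verdict}")
```

Failing closed when no policy exists is one simple way to express the "know where your data goes" principle described above.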
Such concerns have been heightened in the recent climate of geopolitical instability, as well as the febrile climate that has grown around international cyber crime.
The rise in use of the cloud is core to many of the concerns and the difficulties that arise.
Datacentre locations
Datacentre locations and the global supply chain are also concerns, said Patrick Smith, EMEA chief technology officer of Pure Storage, who suggests organisations and states will need to move to – or are already moving towards – building their own sovereign capacity.
This, he said, means physical equipment and in-country datacentre capacity, and that’s not a trivial obstacle to surmount.
“It’s interesting when you think about some of the constrained components that we’ve seen on the global stage,” said Smith. “A great example is Nvidia GPUs [graphics processing units], which require almost a global village to produce them.
“As soon as you start looking at data sovereignty, you’re looking at, ‘How do I build my sovereign capability? Where do I get all the components from?’ Many countries have effectively outsourced datacentres. They’ve put them outside of their own geography.
“With a sovereign capability, you’re talking about having to host those datacentres within your own borders,” he said. “And that suddenly means that you need to have that energy production and water supply to support that datacentre.”
Blood Tests for Alzheimer’s Are Here
Last month, the US Food and Drug Administration approved a new blood test to assist in the diagnosis of Alzheimer’s disease. Produced by Roche, Elecsys pTau181 measures the concentration of a specific molecule—a phosphorylated form of the tau protein—in the blood. Tau is one of two proteins, the other being amyloid, that become malformed and accumulate in the brains of patients with certain types of dementia. It is believed that the buildup of these proteins interferes with the communication of brain cells, leading to these patients’ symptoms.
The test had already received authorization in July for marketing in Europe and is thus the first early screening system for Alzheimer’s for use in primary care settings approved in the world’s two major pharmaceutical markets. It is an opener in what should soon become a crowded field, as several other tests are in advanced stages of testing and approval.
How Do Such Tests Work?
Elecsys pTau181 looks in the blood plasma for a form of the tau protein that has a phosphate group attached, which is often found in elevated amounts in Alzheimer’s patients. This molecule is an indirect marker of the plaques of amyloid and neurofibrillary tangles of tau observed in the brains of patients with the disease.
Some other tests have also been approved, though not for early screening. These assess other biomarkers that relate to these two proteins. One test, called Lumipulse and made by the Japanese company Fujirebio, looks at the ratio between another form of phosphorylated tau (pTau217) and a key protein fragment that forms amyloid plaques (amyloid beta peptide 1-42).
The bottom line is that these tests offer clues to the probable presence of amyloidosis in the brain, which then needs to be diagnosed with greater accuracy using more invasive tests, such as a PET (positron emission tomography) scan and cerebrospinal fluid analysis by lumbar puncture, considered the clinical gold standard for diagnosing amyloid pathology in living patients. Even these, however, come with some degree of uncertainty; true diagnostic certainty can only be had with a post-mortem dissection of the brain.
Why Approve These Tests Now?
In the past, confirmation of an Alzheimer’s diagnosis was not that important, as there were no drugs or therapies that could alter the course of the disease. But with the approval of new Alzheimer’s monoclonal antibody treatments, the landscape has changed in the past few years.
To use these medicines, you need a way to confirm which patients can benefit. And since the drugs ideally yield the best results when used early on in the disease’s progression, a relatively inexpensive and minimally invasive diagnostic test will be extremely useful. Subjecting all elderly people with suspected symptoms of cognitive decline to PET scans and cerebrospinal fluid sampling is impractical, so this is where blood testing for Alzheimer’s comes in.
Just How Useful Are These Tests?
Elecsys pTau181 is the first test to be approved for use as a community-screening tool. The idea is for it to be administered at the primary care level—so, for instance, by a primary care physician or general practitioner. The test has been shown to have a good “negative predictive value”—that is, it is effective at accurately indicating who does not have amyloid disease. In settings where the overall prevalence of amyloid disease is low, a negative result from this test is 97.9 percent reliable. This makes it useful for selecting which patients to put forward for further testing.
The results are similar to those of other tests that have already been approved in recent months, such as Lumipulse from Japan’s Fujirebio, which in trials has shown a negative predictive value of about 97 percent.
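To see why prevalence matters for these figures, the sketch below computes negative predictive value from assumed sensitivity, specificity and prevalence values. The numbers are illustrative only and are not the published performance data for Elecsys pTau181 or Lumipulse.

```python
# Negative predictive value (NPV) from sensitivity, specificity and prevalence.
# The figures below are illustrative assumptions, not Roche's published trial data.

def negative_predictive_value(sensitivity: float, specificity: float, prevalence: float) -> float:
    true_negatives = specificity * (1 - prevalence)
    false_negatives = (1 - sensitivity) * prevalence
    return true_negatives / (true_negatives + false_negatives)

if __name__ == "__main__":
    sensitivity, specificity = 0.92, 0.96  # assumed values, for illustration only
    for prevalence in (0.10, 0.20, 0.40):
        npv = negative_predictive_value(sensitivity, specificity, prevalence)
        print(f"prevalence {prevalence:.0%}: NPV = {npv:.1%}")
```

With these assumed inputs the NPV slips from roughly 99 percent at 10 percent prevalence to about 95 percent at 40 percent prevalence, which is why a negative result is most reassuring in low-prevalence, primary care settings.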
However, there is an important limitation to note: for all blood tests for Alzheimer’s, there tends to be a relatively large proportion of patients (15-30 percent is a common estimate) who fall into a gray area of uncertainty, in which the levels of identified biomarkers do not allow for either a positive or a negative answer.
Buried power lines could cut weather-related outages
A Stanford analysis shows that strategic investment in burying power lines could shorten blackouts during extreme weather, enhancing energy reliability for millions of U.S. households.
As hurricanes intensify, wildfires spread, and winter storm patterns shift, the combination of extreme weather events and aging grid infrastructure threatens to make energy less reliable for tens of millions of U.S. households.
Experts say burying power lines underground can harden the electrical system against threats from wind, ice, falling trees, and other weather-related hazards. Yet undergrounding power lines remains expensive and unevenly implemented. One obstacle has been a lack of information about where investments in undergrounding by utilities and communities could make the biggest difference for reliable power supplies.
In a recent study posted to the arXiv preprint server, Stanford University researchers led by Associate Professor Ram Rajagopal combined previously non-public and siloed datasets to reveal how the distribution of power lines above and below ground has changed since the 1990s. By combining these data with power outage records, the team modeled how having more power lines underground during recent extreme weather events could have shortened outages.
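The preprint describes a far more detailed model, but the flavour of such a counterfactual estimate can be sketched simply: relate a county's underground share to its outage duration, then ask what a 25-percentage-point increase in that share would imply. The figures below are invented for illustration and do not reproduce the Stanford team's methodology.

```python
# Toy counterfactual: relate underground share to outage duration, then shift the share.
# The data points are invented for illustration; this is not the Stanford team's model.
import numpy as np

# Hypothetical county-level observations: (fraction of lines underground, outage hours during an event)
underground_share = np.array([0.05, 0.10, 0.20, 0.35, 0.50, 0.70])
outage_hours      = np.array([30.0, 26.0, 21.0, 15.0, 10.0,  5.0])

# Fit a simple linear trend: outage_hours ~ a * share + b
a, b = np.polyfit(underground_share, outage_hours, 1)

def predicted_outage(share: float) -> float:
    """Predicted outage duration for a given underground share, floored at zero."""
    return max(a * share + b, 0.0)

baseline_share = 0.10
counterfactual_share = baseline_share + 0.25  # bury an additional 25% of lines

baseline = predicted_outage(baseline_share)
counterfactual = predicted_outage(counterfactual_share)
print(f"Estimated outage: {baseline:.1f} h -> {counterfactual:.1f} h "
      f"({(baseline - counterfactual) / baseline:.0%} reduction)")
```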

Patchy progress on burying power lines since 1990
Dense metropolitan areas on the East Coast, parts of southern Florida, and a few southwestern growth hubs were among the first to underground at least a quarter of their power line mileage. The overwhelming majority of power lines remained overhead in most U.S. counties in 1990.
By 2020, some fast-growing suburbs in southeastern and Sunbelt states showed modest increases in undergrounding. For most counties nationwide, however, the median percentage of power lines buried underground remained well below 15%. Large swaths of the Rockies, Midwest, and Gulf Coast showed virtually no change.
Where outages last the longest
Each year, tens of millions of Americans experience power outages. While households on average lose electricity for about four hours over the course of a year, some outages last a day or even weeks. Many of these longer outages are linked to extreme weather events.

New England’s 2017 ‘bomb cyclone’
A nor’easter or “bomb cyclone” that struck Maine, Vermont, and New Hampshire in October 2017 left people without power on average for 27.3 hours per home. The Stanford analysis found that burying an additional 25% of overhead power lines could have cut annual outage totals by 10.8 hours.
Figure: Annual average power outage time for 2017, on a scale from less than one hour (lightest shades) to more than 24 hours (darkest shades). Credit: arXiv (2024). DOI: 10.48550/arxiv.2402.06668

Figure: Undergrounding an additional 25% of power lines could have reduced outages by 10.8 hours (39.7%). Credit: arXiv (2024). DOI: 10.48550/arxiv.2402.06668
California’s 2019 wildfire shutoffs
Amid dry conditions and strong winds in 2019, more than 3 million Californians lost power when utilities preemptively shut down equipment in high-fire-risk areas. The Stanford analysis found that undergrounding an additional 25% of overhead power lines would have cut annual outage totals in the affected area to roughly eight hours from 10.5 hours.
Texas’s 2021 deep freeze
In February 2021, unusually cold temperatures in Texas left 4.5 million homes and businesses without power for just over 19 hours. The researchers found that having an additional 25% of power lines underground during this event could also have shortened average outage times by 2.5 hours.
Explore the data
You can view more analysis from the Stanford researchers and explore county-level undergrounding and outage patterns in an interactive project developed by the Stanford Doerr School of Sustainability in collaboration with TechSoup. The researchers have made their 2020 data on the proportion of underground distribution power lines publicly available through Stanford’s Data Commons for Sustainability.
More information:
Tao Sun et al, Mapping the Depths: A Stocktake of Underground Power Distribution in United States, arXiv (2024). DOI: 10.48550/arxiv.2402.06668
Citation: Buried power lines could cut weather-related outages (2025, November 5), retrieved 5 November 2025 from https://techxplore.com/news/2025-11-power-lines-weather-outages.html
Omada unveils software upgrades to accelerate smarter networking | Computer Weekly
In an upgrade that spans network planning through to management, and which it describes as marking a “significant evolution” in its ecosystem, Omada has embarked on a software refresh designed to enable users to plan smarter networks that can be deployed faster and managed with greater precision and confidence.
The upgrades from TP-Link Systems’ business solution brand include enhancements to Omada Network 6.0, Omada App 5.0, Wi-Fi Navi App V1.5 and a new Omada Design Hub. They are designed to deliver a smarter, more integrated experience for managed service providers (MSPs), system integrators (SIs) and installers, as well as everyday users. With end-to-end tools for planning, deployment and management, Omada claimed it can empower businesses to build high-performance networks with greater speed, precision and reliability.
At the heart of the upgrades is Omada Network 6.0, described as a major refit designed to simplify and supercharge network operations. Built for professionals managing complex deployments, it offers a new interface and enhanced interactions to make troubleshooting faster, monitoring more precise and configuration more intuitive.
The redesigned dashboard features a five-tab layout – overview, topology, Wi-Fi, client and traffic – to deliver better visual insights, while the newly designed interface and menus aim to make the configuration and management experience smoother. New visualisations, such as AP density maps and heatmaps, are intended to help IT teams understand user behaviour and deployment performance at a glance.
The company said its “standout addition” is the multi-level health scoring system available in the cloud-based controller. It is engineered to automatically evaluate the status of devices, clients, WLANs and sites, enabling simplified monitoring and early detection of issues across multiple layers.
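Omada has not published the scoring logic, but the general shape of a multi-level health score can be sketched as device-level checks rolling up into a site-level figure. The metrics, weights and thresholds below are invented for illustration and are not Omada’s actual algorithm.

```python
# Generic illustration of multi-level health scoring: devices roll up into a site score.
# The metrics, weights and thresholds are invented; this is not Omada's actual algorithm.

def device_health(cpu_load: float, memory_use: float, packet_loss: float) -> float:
    """Score a single device from 0 (critical) to 100 (healthy)."""
    score = 100.0
    score -= max(cpu_load - 0.7, 0) * 100      # penalise sustained CPU above 70%
    score -= max(memory_use - 0.8, 0) * 100    # penalise memory above 80%
    score -= packet_loss * 500                 # packet loss is weighted heavily
    return max(score, 0.0)

def site_health(device_scores: list[float]) -> float:
    """Site score biased toward its worst device, so a single failing AP stays visible."""
    worst = min(device_scores)
    average = sum(device_scores) / len(device_scores)
    return 0.5 * worst + 0.5 * average

if __name__ == "__main__":
    scores = [device_health(0.55, 0.60, 0.00),   # healthy access point
              device_health(0.90, 0.85, 0.02)]   # struggling access point
    print([round(s, 1) for s in scores], "-> site:", round(site_health(scores), 1))
```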
Smart Topology has also been upgraded with real-time VLAN visibility and disconnected device tracking. Customisable filters make it easier to locate faults and streamline troubleshooting. Enhanced client recognition now identifies device type, brand and model automatically, while the new device and client page visualises activity timelines and event history for full lifecycle management.
Omada claims that network configuration is faster than ever, with a simple three-step VLAN setup and centralised bulk port management across switches. These improvements set out to eliminate guesswork and reduce configuration time from hours to minutes, especially in large-scale deployments.
Integrated with Omada’s core solution, the Omada Design Hub is a free, cloud-based network planner, offering AI-powered precision during each stage of deployment. Design Hub helps to “simulate, visualise and deliver tailored solutions” in use cases such as designing for offices, homes, hotels or schools.
Users can now upload floor plans, auto-detect walls and instantly generate Wi-Fi heatmaps. The platform supports automatic AP placement and cabling, including cross-floor connections, and one-click proposal exports with topology maps, device lists and simulation results. It also lets users personalise reports for clients, speeding up communication and delivery.
Bulk adjustments, editable equipment lists with pricing, and real-time topology tools have been updated to make planning faster and more accurate. Adaptive spatial models and signal strength calculations ensure reliable coverage and installation-ready designs.
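Signal-strength estimates of this kind typically rest on standard propagation models. As a rough illustration, the sketch below applies a log-distance path-loss model with a per-wall attenuation term; the constants are common textbook values for 2.4 GHz indoors, not parameters taken from Design Hub.

```python
# Rough Wi-Fi coverage estimate using a log-distance path-loss model with wall attenuation.
# Constants are common textbook values for 2.4 GHz indoors, not Omada Design Hub parameters.
import math

def received_signal_dbm(tx_power_dbm: float, distance_m: float,
                        walls: int, path_loss_exponent: float = 3.0,
                        wall_loss_db: float = 5.0) -> float:
    """Estimate received power at a given distance through a number of interior walls."""
    # Free-space reference loss at 1 m for 2.4 GHz is roughly 40 dB.
    reference_loss_db = 40.0
    path_loss = reference_loss_db + 10 * path_loss_exponent * math.log10(max(distance_m, 1.0))
    return tx_power_dbm - path_loss - walls * wall_loss_db

if __name__ == "__main__":
    for distance, walls in [(5, 0), (15, 1), (30, 2)]:
        rssi = received_signal_dbm(tx_power_dbm=20.0, distance_m=distance, walls=walls)
        usable = "good" if rssi > -67 else "marginal"
        print(f"{distance} m, {walls} wall(s): {rssi:.0f} dBm ({usable})")
```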
Meanwhile, the Omada Wi-Fi Navi App V1.5, a free network troubleshooting tool, expands its toolkit for installers and administrators. New features include Wi-Fi Integrated Test, Walking Test, IP/Port Scanners, Public IP Lookup, and Bandwidth/PoE calculators. It also adds iPerf2 support and improved scanning for deployment validation and on-site issue resolution.
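A PoE budget check of the kind such calculators perform can be illustrated in a few lines. The per-device power draws below are typical ballpark figures and the switch budget is an assumed example, not a specific Omada model’s specification.

```python
# Simple PoE budget check: sum expected device draws against a switch's PoE budget.
# Device power draws are typical ballpark figures; the 120 W budget is an assumed example.

TYPICAL_DRAW_WATTS = {
    "wifi_ap": 13.0,       # 802.3af-class access point
    "ip_camera": 7.0,
    "voip_phone": 5.0,
}

def poe_budget_ok(devices: dict[str, int], switch_budget_w: float, headroom: float = 0.8) -> bool:
    """Allow the plan only if total draw stays within a safety margin of the budget."""
    total = sum(TYPICAL_DRAW_WATTS[name] * count for name, count in devices.items())
    print(f"Total draw: {total:.1f} W of {switch_budget_w:.0f} W budget")
    return total <= switch_budget_w * headroom

if __name__ == "__main__":
    plan = {"wifi_ap": 4, "ip_camera": 6, "voip_phone": 5}
    print("Within budget" if poe_budget_ok(plan, switch_budget_w=120.0) else "Over budget")
```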
