Tech
Storage data management tools: What they do and what’s available | Computer Weekly
Pure Storage’s recent launch of its Enterprise Data Cloud reignited debate around storage and data management.
Pure claims its EDC addresses the management of growing volumes of data in a complex regulatory environment, and the demands storage faces from artificial intelligence (AI) workloads.
The idea of a single management layer for storage is not new. Pure is not the only supplier looking to automate storage provision and management, where storage comes in “fleets”, and data management and governance take place across local, cloud and hybrid infrastructure.
But while analysts agree Pure has a technical edge for now, most suppliers offer tools that work across on-premise and cloud technologies with the aim of reducing storage management overheads through automation and AI.
Analyst GigaOm, for example, rates Pure Storage as a leader in data pipeline support, especially for demanding AI deployments, alongside Hitachi Vantara, HPE, IBM, NetApp and Dell Technologies.
“Adopting high-performance storage optimised for AI workloads is a strategic business imperative, not merely a technical upgrade,” says Whit Walters, field chief technology officer at GigaOm.
From storage to data management
AI’s demands for vast amounts of data have pushed chief information officers (CIOs) and suppliers to look beyond technical infrastructure management of storage to a wider concept of data management.
This extends from conventional metrics, such as capacity, performance and availability, and routine tasks such as provisioning and backup, to issues such as data location for compliance and ransomware protection.
At a basic level, CIOs need to control all of a supplier’s products from a single control plane, from on-premise to the cloud. This includes day-to-day tasks like provisioning, data migration and upgrades, as well as robust monitoring. Ideally, data management should integrate with the supplier’s as-a-service tools, too.
But all this becomes harder as data volumes and performance requirements increase.
“There is a growing challenge with managing enterprise infrastructure at scale,” explains Simon Robinson, principal analyst at Enterprise Strategy Group. “This is not a new problem. Infrastructure and operations teams spend too much time instrumenting, fine-tuning, provisioning and managing capacity for their enterprise workloads. Storage is still pretty onerous in that respect.”
Improvements in storage management, he says, have mostly been technical, such as thin provisioning, and at the array level. This makes it harder to scale systems, and fails to account for integration with the cloud.
“Now the control plane needs to extend across the on-premise environment and the public cloud,” says Robinson. “That is a really difficult problem to solve.”
Meanwhile, data and storage management tools rarely work across rival supplier platforms. Even though platform-neutral storage management has been tried, suppliers reverted to their own tools, with extensions into cloud environments.
The argument is that single supplier tools offer a performance advantage that outweighs the drawbacks.
“Going back 10 years, the goal was to consistently manage a heterogeneous vendor environment,” says Robinson. “That hasn’t materialised. The trade-off with all of these approaches is that you are going to get the best results if you standardise around a particular vendor’s systems.”
Some supplier offerings, such as IBM Storage Virtualize, provide multi-supplier support. But most, such as Pure’s EDC, assume IT leaders will trade compatibility for performance.
Here, we list some key data management features of the main data storage suppliers.
Dell Technologies
Dell’s PowerScale technology provides a scale-out architecture, supporting management of local and cloud storage from the same interface.
Dell’s data management offerings for AI include DataIQ, for unstructured data, and CloudIQ, for cloud monitoring.
DataIQ works across Dell EMC PowerScale and Isilon hardware, as well as S3-compatible cloud storage. Through Apex, Dell also provides a platform for multi-cloud management, although it is not specific to storage.
HPE
HPE says its Alletra Storage product gives a “cloud experience” for workloads locally or in the cloud. Its GreenLake platform provides as-a-service storage across on-premise, hybrid and cloud environments.
HPE’s Zerto offers data protection across hybrid environments. Alongside this, HPE’s Data Management Framework 7 provides data management tools across high-performance and AI storage, including tiering and automated file movement.
Huawei
Huawei’s data management engine (DME) provides provisioning, lifecycle management, alerting and anomaly detection. It also supports multi-cloud operations, and uses AI to predict system risks, through DME IQ.
DME supports Huawei’s own arrays and its FusionStorage, as well as some support for third-party hardware and hosts such as ESXi.
IBM
IBM has a wide range of storage and data management capabilities, split across a range of tools. Storage Virtualize is a long-established tool able to manage hardware in multi-supplier environments.
IBM Storage Insights Pro is subscription-based, and provides inventory, capacity and performance management for IBM and non-IBM block storage.
IBM Storage Scale provides high-performance data management, while IBM Spectrum Control delivers monitoring and analytics across multiple suppliers on-premise and in the cloud.
NetApp
NetApp has a range of storage and data management capabilities, including through its Ontap storage operating system, its StorageGrid multi-cloud technology and its Keystone as-a-service offering.
Keystone can control storage across on-premise and the cloud, and includes governance, compliance and ransomware protection, as well as deployment and management tools. BlueXP allows users to control storage and data services across local and cloud systems.
Hitachi Vantara
Hitachi Vantara’s VSP One offers a single data plane to integrate data and simplify management across on-premise and cloud environments. It supports block, file and object storage, as well as software-defined storage (SDS) and, unusually, mainframes.
VSP One SDS can run on third-party hardware, as well as on Amazon’s cloud. VSP 360 provides cloud orchestration as well as fleet management; Everflex provides storage-as-a-service.
Pure Storage
Enterprise Data Cloud allows customers to manage data across a “storage cloud”, regardless of the location of the physical storage. This, Pure says, allows customers to focus on managing data.
It also allows any Pure array to work as an endpoint for the fleet. EDC is made up of Pure’s hardware layer, its cloud-based Pure1 storage management and optimisation platform, and its Pure Fusion control plane for fleet management.
A Possible US Government iPhone-Hacking Toolkit Is Now in the Hands of Foreign Spies and Criminals
Google notes that Apple patched vulnerabilities used by Coruna in the latest versions of its mobile operating system, iOS 26, so its exploitation techniques are only confirmed to work against iOS 13 through 17.2.1. It targets vulnerabilities in Apple’s WebKit framework for browsers, so Safari users on those older versions of iOS would be vulnerable, but there are no confirmed techniques in the toolkit for targeting Chrome users. Google also notes that Coruna checks whether an iOS device has Apple’s most stringent security setting, known as Lockdown Mode, enabled, and doesn’t attempt to hack it if so.
Despite those limitations, iVerify says Coruna likely infected tens of thousands of phones. The company consulted with a partner that has access to network traffic and counted visits to a command-and-control server for the cybercriminal version of Coruna infecting Chinese-language websites. The volume of those connections suggests, iVerify says, that roughly 42,000 devices may have already been hacked with the toolkit in the for-profit campaign alone.
Just how many other victims Coruna may have hit, including Ukrainians who visited websites infected with the code by the suspected Russian espionage operation, remains unclear. Google declined to comment beyond its published report. Apple did not immediately provide comment on Google or iVerify’s findings.
In iVerify’s analysis of the cybercriminal version of Coruna—it didn’t have access to any of the earlier versions—the company found that the code appeared to have been altered to plant malware on target devices designed to drain cryptocurrency from crypto wallets as well as steal photos and, in some cases, emails. Those additions, however, were “poorly written” compared to the underlying Coruna toolkit, which iVerify chief product officer Spencer Parker found to be impressively polished and modular.
“My god, these things are very professionally written,” Parker says of the exploits included in Coruna, suggesting that the cruder malware was added by the cybercriminals who later obtained that code.
As for the clues that suggest Coruna’s origins as a US government toolkit, iVerify’s Cole notes that it’s possible the overlap between Coruna’s code and the Operation Triangulation code that Russia pinned on US hackers stems from Triangulation’s components being picked up and repurposed after they were discovered. But Cole argues that’s unlikely. Many components of Coruna have never been seen before, he points out, and the whole toolkit appears to have been created by a “single author,” as he puts it.
“The framework holds together very well,” says Cole, who previously worked at the NSA, but notes that he’s been out of the government for more than a decade and isn’t basing any findings on his own outdated knowledge of US hacking tools. “It looks like it was written as a whole. It doesn’t look like it was pieced together.”
If Coruna is, in fact, a US hacking toolkit gone rogue, just how it got into foreign and criminal hands remains a mystery. But Cole points to the industry of brokers that may pay tens of millions of dollars for zero-day hacking techniques that they can resell for espionage, cybercrime, or cyberwar. Notably, Peter Williams, an executive of US government contractor Trenchant, was sentenced this month to seven years in prison for selling hacking tools to the Russian zero-day broker Operation Zero from 2022 to 2025. Williams’ sentencing memo notes that Trenchant sold hacking tools to the US intelligence community as well as others in the “Five Eyes” group of English-speaking governments—the US, UK, Australia, Canada and New Zealand—though it’s not clear what specific tools he sold or what devices they targeted.
“These zero-day and exploit brokers tend to be unscrupulous,” says Cole. “They sell to the highest bidder and they double dip. Many don’t have exclusivity arrangements. That’s very likely what happened here.”
“One of these tools ended up in the hands of a non-Western exploit broker, and they sold it to whoever was willing to pay,” Cole concludes. “The genie is out of the bottle.”
Apple’s New MacBook Air and MacBook Pro Have New Chips, More Storage, and Higher Prices
Alongside its price-friendly iPhone 17e and M4 iPad Air yesterday, Apple just announced a few updates to the MacBook Pro, MacBook Air, and its rarely refreshed desktop display line.
The MacBook Air has now been updated to the latest M5 chip. It’s a fairly modest upgrade, but it brings the Air up to speed with Apple’s latest processor, which debuted in the MacBook Pro last fall. There are no other major hardware changes beyond 512 GB of starting storage with “faster SSD technology”, and you can still get the Air in either a 13- or 15-inch screen size.
This laptop also features Apple’s N1 wireless chip, which includes Wi-Fi 7 and Bluetooth 6 for the latest connectivity standards. It still comes with the standard 16 GB of RAM, and sadly, there’s a $100 price bump to account for the extra storage. It now starts at $1,099 for the 13-inch model and $1,299 for the 15-inch model. Apple says you can preorder it tomorrow, with sales kicking off on March 11.
More interestingly, Apple is expanding the M5 chip series with the M5 Pro and M5 Max, now available in the 14-inch and 16-inch MacBook Pro. Like previous generations of Apple silicon, the “Pro” and “Max” configurations add significantly improved multi-core CPU and graphics performance.
The M5 Pro and M5 Max can be configured with up to 18 CPU cores (12 performance cores and 6 “super” cores), up from 16 on the M4 Max. The M5 Pro can scale up to 20 GPU cores, while the M5 Max extends up to 40 GPU cores. Thanks to higher memory bandwidth, a more efficient Neural Engine and an improved GPU architecture, Apple says the M5 Pro and M5 Max have “over 4X the peak CPU compute for AI” compared to the last generation and offer 20 percent better GPU performance.
The new MacBook Pros don’t include any other hardware changes; things have stayed largely the same since 2021—same port selection, Mini-LED display, speakers, and webcam. Even the claimed 24-hour battery life hasn’t changed from the M4 models, which came out in late 2024. Interestingly, as recently as last week, Bloomberg reported that Apple plans to launch a more significant update to the MacBook Pro later this fall, which will reportedly debut the M6 chip, an OLED touchscreen, and a thinner chassis.
Like the MacBook Air, all versions of the M5 Pro or M5 Max MacBook Pros come with twice the storage and a slightly higher starting price. Coming with 1 TB, the 14-inch M5 Pro now starts at $2,199, and the 16-inch model at $2,699. That’s $200 more than last year’s machines. Meanwhile, M5 Max prices start at $3,599.
National Grid, Nebius and Emerald hail datacentre power throttling | Computer Weekly
National Grid has carried out the first trial of flexible electricity usage by a UK datacentre, in conjunction with operator Nebius. The trial ran artificial intelligence (AI)-powered datacentre management software from Emerald AI on a bank of 96 Nvidia Blackwell Ultra high-performance graphics processing units (GPUs) at a Nebius datacentre near London.
Over five days in December 2025, more than 200 real-time simulated “grid events” were sent to the site to test the Emerald software’s ability to dynamically adjust the datacentre’s power consumption.
Emerald AI’s platform was able to adjust power use to the requested level and cut demand by up to 40% while critical workloads ran as normal.
Key results included successfully reacting to spikes in demand during half-time at football matches; sustaining load-reduction requests for up to 10 hours, demonstrating an ability to help the grid navigate periods of low wind or extreme heat; and shedding 30% of load in 30 seconds during a simulated system stress event, to help maintain grid resilience.
According to the partners involved in the trial, such capabilities could enable AI datacentres to add more than 2GW of capacity back to the grid when needed.
The aim is that AI datacentres can move from being simply a source of electricity constraint to being more controllable in relation to the electricity grid, by managing peaks, making better use of existing infrastructure, and supporting the connection of different sources of energy to the grid.
“Most electric networks, most electric power systems, operate with probably 30% of capacity in place a year; there’s lots of capacity in the system, it’s a small number of hours a year when we’re at peak,” said Steve Smith, president of National Grid Partners, speaking at the Economist Impact Sustainability Week event in London.
“So, the trick is how you do it,” said Smith. “Because if you can throw more electrons at a fixed-cost system, you don’t need to put more infrastructure in, and the rates come down for everyone else.
“If you’re doing a small number of hours and you’re stretched, if we say, can you actually moderate your load when we need you to, then we don’t need to build lots more capacity.”
Also speaking at the Sustainability Week event, Varun Sivaram, chief executive of Emerald AI, said the trial showed that AI hardware at the Nebius datacentre could consume energy flexibly at a moment’s notice.
“When we got the signal in the middle of the night, we were able to reduce power within 30 seconds by over a third,” said Sivaram. “That’s also going to be the case with renewable energy, when there’s low wind, for eight hours, and the AI factory can reduce its consumption in such a way that we protect the critical workloads that run at 100% throughput.”
Sivaram explained that there are three ways to achieve flexibility of power consumption for AI workloads. The first is to slow some down or pause them. “Maybe a fine-tuning model run that doesn’t need to finish right this second, but it can be delayed by an hour,” he suggested.
The second way, he said, is by moving AI workloads. “You expect your answer from AI pretty soon, but we may be able to move it, as we did with a move between two different Oracle datacentres at the rate of 10 milliseconds of latency. There is a little bit of a latency penalty, but not relevant for that workload,” said Sivaram.
The third way, he said, is to monitor the datacentre to achieve flexibility. Here, Emerald operates as a layer of software intelligence that runs AI workloads – which can include tagging them with different priorities – in an optimal way to give the grid what it needs while protecting the integrity of the workloads for the user.
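The priority-tagging approach described above can be illustrated with a minimal sketch. This is a hypothetical example, not Emerald AI’s actual software: the workload names, power figures and two-level priority scheme are invented for illustration. The idea is simply that, when a grid signal arrives, deferrable workloads are paused until the requested reduction is met, while critical workloads keep running at full throughput.

```python
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    power_kw: float
    priority: str  # "critical" keeps running; "deferrable" can be paused

def shed_load(workloads, target_reduction_kw):
    """Pause deferrable workloads, largest power draw first, until the
    grid's requested reduction is met. Critical workloads are untouched."""
    shed = 0.0
    paused = []
    for w in sorted(workloads, key=lambda w: w.power_kw, reverse=True):
        if shed >= target_reduction_kw:
            break
        if w.priority == "deferrable":
            paused.append(w.name)
            shed += w.power_kw
    return paused, shed

# Hypothetical fleet: a fine-tuning run can wait; user-facing inference cannot.
fleet = [
    Workload("inference-api", 400.0, "critical"),
    Workload("fine-tune-llm", 350.0, "deferrable"),
    Workload("batch-embeddings", 250.0, "deferrable"),
]

paused, shed = shed_load(fleet, target_reduction_kw=300.0)
print(paused, shed)  # pauses fine-tune-llm, shedding 350 kW
```

A real system would also handle the other two levers Sivaram mentions – delaying jobs in time and migrating them between sites – but the core mechanism of tagging workloads by priority and shedding the deferrable ones is what the sketch shows.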