
Storage data management tools: What they do and what’s available | Computer Weekly



Pure Storage’s recent launch of its Enterprise Data Cloud reignited debate around storage and data management.

Pure claims its EDC addresses the management of growing volumes of data in a complex regulatory environment, and the demands storage faces from artificial intelligence (AI) workloads.

The idea of a single management layer for storage is not new. Pure is not the only supplier looking to automate storage provision and management, where storage comes in “fleets”, and data management and governance take place across local, cloud and hybrid infrastructure.

But while analysts agree Pure has a technical edge for now, most suppliers offer tools that work across on-premise and cloud technologies with the aim of reducing storage management overheads through automation and AI.

Analyst GigaOm, for example, rates Pure Storage as a leader in data pipeline support, especially for demanding AI deployments, alongside Hitachi Vantara, HPE, IBM, NetApp and Dell Technologies.

“Adopting high-performance storage optimised for AI workloads is a strategic business imperative, not merely a technical upgrade,” says Whit Walters, field chief technology officer at GigaOm.

From storage to data management

AI’s demand for vast amounts of data has pushed chief information officers (CIOs) and suppliers to look beyond the technical infrastructure management of storage towards a wider concept of data management.

This extends from managing conventional metrics, such as capacity, performance and availability, and routine tasks such as provisioning and backup, to issues such as data location for compliance and ransomware protection.

At a basic level, CIOs need to control all of a supplier’s products from a single control plane, from on-premise to the cloud. This includes day-to-day tasks like provisioning, data migration and upgrades, as well as robust monitoring. Ideally, data management should integrate with the supplier’s as-a-service tools, too.
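To make the idea of a single control plane concrete, the sketch below shows what provisioning through one might look like programmatically. It is purely illustrative: the endpoint, token and payload fields are hypothetical and do not correspond to any supplier’s actual API.

```python
# Hypothetical sketch: provisioning a volume through a single control plane.
# The endpoint, credential and payload fields are invented for illustration
# only; real supplier APIs differ.
import requests

CONTROL_PLANE = "https://controlplane.example.com/api/v1"  # hypothetical endpoint
TOKEN = "example-token"                                     # hypothetical credential

def provision_volume(name: str, size_gb: int, location: str) -> dict:
    """Request a new volume; 'location' could be an on-premise array or a cloud region."""
    response = requests.post(
        f"{CONTROL_PLANE}/volumes",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"name": name, "size_gb": size_gb, "location": location},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    # The same call shape is used whether the volume lands on-premise or in the
    # cloud; the control plane decides placement and applies policy.
    print(provision_volume("analytics-scratch", 512, "on-prem-array-01"))
    print(provision_volume("analytics-archive", 2048, "eu-west-1"))
```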

But all this becomes harder as data volumes and performance requirements increase.

“There is a growing challenge with managing enterprise infrastructure at scale,” explains Simon Robinson, principal analyst at Enterprise Strategy Group. “This is not a new problem. Infrastructure and operations teams spend too much time instrumenting, fine-tuning, provisioning and managing capacity for their enterprise workloads. Storage is still pretty onerous in that respect.”

Improvements in storage management, he says, have mostly been technical, such as thin provisioning, and at the array level. This makes it harder to scale systems, and fails to account for integration with the cloud. 

“Now the control plane needs to extend across the on-premise environment and the public cloud,” says Robinson. “That is a really difficult problem to solve.”

Meanwhile, data and storage management tools rarely work across rival supplier platforms. Even though platform-neutral storage management has been tried, suppliers reverted to their own tools, with extensions into cloud environments. 

The argument is that single-supplier tools offer a performance advantage that outweighs the drawbacks.

“Going back 10 years, the goal was to consistently manage a heterogeneous vendor environment,” says Robinson. “That hasn’t materialised. The trade-off with all of these approaches is that you are going to get the best results if you standardise around a particular vendor’s systems.”

Some supplier offerings, such as IBM Storage Virtualize, provide multi-supplier support. But most, such as Pure’s EDC, assume IT leaders will trade compatibility for performance.

Here, we list some key data management features of the main data storage suppliers.

Dell Technologies

Dell’s PowerScale technology provides a scale-out architecture, supporting management of local and cloud storage from the same interface.

Dell covers data management for AI through two tools: DataIQ for unstructured data and CloudIQ for cloud.

DataIQ works across Dell EMC PowerScale and Isilon hardware, as well as S3-compatible cloud storage. Through Apex, Dell also provides a platform for multi-cloud management, although it is not specific to storage.
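DataIQ’s coverage of S3-compatible targets reflects how that protocol has become the common denominator for unstructured data. As a rough illustration, and not DataIQ’s own interface, the sketch below uses the boto3 library to total the capacity consumed in an S3-compatible bucket; the endpoint, credentials and bucket name are hypothetical placeholders.

```python
# Minimal sketch: tallying consumed capacity in an S3-compatible bucket with boto3.
# Endpoint URL, credentials and bucket name are hypothetical placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.storage.example.com",  # hypothetical S3-compatible endpoint
    aws_access_key_id="EXAMPLE_KEY",
    aws_secret_access_key="EXAMPLE_SECRET",
)

def bucket_usage_bytes(bucket: str) -> int:
    """Sum object sizes in a bucket by paging through list_objects_v2."""
    total = 0
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket):
        for obj in page.get("Contents", []):
            total += obj["Size"]
    return total

if __name__ == "__main__":
    used = bucket_usage_bytes("research-archive")  # hypothetical bucket name
    print(f"research-archive: {used / 1024**3:.1f} GiB consumed")
```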

HPE

HPE says its Alletra Storage product gives a “cloud experience” for workloads locally or in the cloud. Its GreenLake platform provides as-a-service storage across on-premise, hybrid and cloud environments.

Zerto offers data protection across hybrid environments. Alongside this, HPE’s Data Management Framework 7 provides data management tools across high-performance and AI storage, including tiering and automated file movement.
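The automated file movement that tools such as DMF provide is, at its core, policy-driven tiering. As a simplified, generic illustration rather than HPE’s implementation, the sketch below demotes files that have not been accessed for a set number of days from a fast tier to an archive tier; the directory paths and the 90-day threshold are assumptions.

```python
# Generic sketch of age-based tiering: demote files not accessed recently
# from a fast tier to an archive tier. Paths and the 90-day threshold are
# illustrative assumptions, not HPE DMF behaviour.
import os
import shutil
import time

FAST_TIER = "/mnt/fast"        # hypothetical high-performance filesystem
ARCHIVE_TIER = "/mnt/archive"  # hypothetical capacity/archive filesystem
MAX_IDLE_DAYS = 90

def demote_cold_files() -> None:
    cutoff = time.time() - MAX_IDLE_DAYS * 86400
    for root, _dirs, files in os.walk(FAST_TIER):
        for name in files:
            src = os.path.join(root, name)
            if os.stat(src).st_atime < cutoff:           # last access older than cutoff
                rel = os.path.relpath(src, FAST_TIER)
                dst = os.path.join(ARCHIVE_TIER, rel)
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.move(src, dst)                     # move to the archive tier
                print(f"demoted {rel}")

if __name__ == "__main__":
    demote_cold_files()
```

In production systems, the same idea is typically driven by metadata scans and policies rather than a filesystem walk, and moves are tracked so files can be recalled transparently.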

Huawei

Huawei’s Data Management Engine (DME) provides provisioning, lifecycle management, alerting and anomaly detection. It also supports multi-cloud operations and, through DME IQ, uses AI to predict system risks.

DME supports Huawei’s own arrays and its FusionStorage, and offers some support for third-party hardware and hypervisor hosts such as VMware ESXi.

IBM

IBM has a wide range of storage and data management capabilities, split across a range of tools. Storage Virtualize is a long-established tool able to manage hardware in multi-supplier environments.

IBM Storage Insights Pro is subscription-based, and provides inventory, capacity and performance management for IBM and non-IBM block storage.

IBM Storage Scale provides high-performance data management, while IBM Spectrum Control delivers monitoring and analytics across multiple suppliers on-premise and in the cloud.

NetApp

NetApp has a range of storage and data management capabilities, including through its Ontap storage operating system, its StorageGRID multi-cloud technology and its Keystone as-a-service offering.

Keystone can control storage across on-premise and the cloud, and includes governance, compliance and ransomware protection, as well as deployment and management tools. BlueXP allows users to control storage and data services across local and cloud systems.

Hitachi Vantara

Hitachi Vantara’s VSP One offers a single data plane to integrate data and simplify management across on-premise and cloud environments. It supports block, file and object storage, as well as software-defined storage (SDS) and, unusually, mainframes.

VSP One SDS can run on third-party hardware, as well as on Amazon’s cloud. VSP 360 provides cloud orchestration as well as fleet management; Everflex provides storage-as-a-service.

Pure Storage

Enterprise Data Cloud allows customers to manage data across a “storage cloud”, regardless of where the physical storage is located. This lets customers focus on managing data, the company says.

It also allows any Pure array to work as an endpoint for the fleet. EDC is made up of Pure’s hardware layer, its cloud-based Pure1 storage management and optimisation platform, and its Pure Fusion control plane for fleet management. 




Carbon opportunities highlighted in Australia’s utilities sector




Australia’s utility sector accounts for some 43.1% of the country’s carbon footprint, and some 37.2% of its direct emissions, new research from Edith Cowan University (ECU) has revealed.

Dr. Soheil Kazemian, from the ECU School of Business and Law, said the utilities sector included electricity generation, transmission and distribution, gas supply, water supply, and waste collection and treatment.

Electricity generation and transmission were identified as the most significant contributors within the utilities sector, with commercial services and manufacturing emerging as substantial sources of embodied emissions within the sector.

The research, published in the Management of Environmental Quality: An International Journal, revealed that 71% of embodied emissions were attributed to electricity transmission, distribution, on-selling electricity, and electricity market operation. Electricity generation accounted for a further 15%, while gas supply accounted for 5%, water supply for 4%, and waste services and treatment for the remaining 5% of embodied emissions in the sector.

“The study highlights electricity transmission and generation as the subsectors with the highest potential for adopting low-carbon technologies. By pinpointing emission hotspots and offering detailed sectoral disaggregation, the results of the research provide actionable insights for prioritizing investment in emissions reduction strategies, advancing Australia’s sustainability goals and supporting global climate change mitigation,” Dr. Kazemian said.

He said that, as with any other business, the pressure to reduce the utilities sector’s carbon footprint would need to originate from the consumer sector.

Unlike other sectors, however, increased investment into the utilities sector is likely to result in a smaller carbon footprint.

“This is a major difference between the different sectors in Australia. If you invest more in mining, that means the emissions from that industry would increase, and the same can be said for manufacturing, as the investment would result in expanded business.

“While new infrastructure development can generate temporary increases in emissions for the utility sector during construction, the long-term impact depends on where those dollars are spent. Investment in renewable energy or efficient delivery networks can significantly cut emissions, whereas continuing to fund carbon-intensive energy sources risks locking in higher emissions for decades to come.

“This complexity highlights a critical point that meaningful decarbonization will depend not only on policy or technology, but also on consumer choices. When households and businesses demand cleaner energy, utilities are more likely to channel investment into low-carbon solutions. By consciously choosing renewable energy options and supporting sustainable providers, consumers can send a powerful market signal that accelerates the transition to a cleaner grid,” Dr. Kazemian said.

More information:
Soheil Kazemian et al, Determining the carbon footprint of Australia’s electricity, gas, water and waste services sector, Management of Environmental Quality: An International Journal (2025). DOI: 10.1108/meq-07-2024-0311







AI-ready companies turning network pilots into profit | Computer Weekly



While the AI genie is out of the bottle for organisations of all sizes, only 13% of businesses are fully prepared for it, with those that are ready as much as four times more likely to move pilots into production and 50% more likely to see measurable value, according to a study by Cisco.

The data comes from the Cisco AI readiness index 2025, a global study, now in its third year, based on a double-blind survey of 8,000 senior IT and business leaders responsible for AI strategy at organisations with more than 500 employees, across 26 industries in 30 markets.

Cisco added that the combination of foresight and foundation is delivering real, tangible results at a time when two major forces are starting to reshape the landscape: AI agents, which raise the bar for scale, security and governance; and AI infrastructure debt, the early warning signs of hidden bottlenecks that threaten to erode long-term value.

Regarding AI agents, the survey found ambition was outpacing readiness. Overall, 83% of organisations planned to deploy AI agents, and nearly 40% expected them to work alongside employees within a year. But the study discovered that, for the majority of these companies, AI agents were exposing weak foundations – that is, systems that can barely handle reactive, task-based AI, let alone AI systems that act autonomously and learn continuously. More than half (54%) of respondents said their networks could not scale for complexity or data volume, and just 15% described their networks as flexible or adaptable.

AI infrastructure debt was called the modern evolution of the technical and digital debt that once held back digital transformation. The survey described it as “the silent accumulation of compromises, deferred upgrades, and underfunded architecture that erodes the value of AI over time”. Some 62% of firms expected workloads to rise by more than 30% within three years, 64% struggled to centralise data, only 26% said they had robust GPU capacity, and fewer than one in three could detect or prevent AI-specific threats.

Among the topline results from the report was that a “small but consistent” group of the companies surveyed – those falling into the pacesetter category, which has made up about 13% of organisations for the past three years – outperformed their peers across every measure of AI value.

Cisco noted that the pacesetters’ sustained advantage indicated a new form of resilience: a disciplined, system-level approach that balances strategic drivers with the data and network infrastructure needed to keep pace with AI’s accelerating evolution. It added that such firms were already architecting for the future, with 98% designing their networks for the growth, scale and complexity of AI, compared with 46% overall.

The research outlined a pattern among companies delivering real returns: they make AI part of the business, not a side project; they build infrastructure that’s ready to grow; they move pilots into production; they measure what matters; and they turn security into strength.

Virtually all pacesetters (99%) were found to have a defined AI roadmap (vs 58% overall), and 91% (vs 35%) had a change-management plan. Budgets match intent, with 79% making AI the top investment priority (vs 24%), and 96% with short- and long-term funding strategies (vs 43%). The study noted that such firms architect for the always-on AI era. Some 71% of pacesetters said their networks were fully flexible and could scale instantly for any AI project (vs 15% overall), and 77% were investing in new datacentre capacity within the next 12 months (vs 43%).

Just over three-fifths had what was defined as a “mature, repeatable” innovation process for generating and scaling AI use cases (versus 13% overall), and three-quarters (77%) had already finalised those use cases (versus 18%). Some 95% track the impact of their AI investments – three times higher than others – and 71% were confident their use cases will generate new revenue streams, more than double the overall average. Meanwhile, 87% were highly aware of AI-specific threats (versus 42% overall), 62% integrated AI into their security and identity systems (versus 29%), and 75% were fully equipped to control and secure AI agents (versus 31%).

The result of this approach, said Cisco, was that pacesetters achieve more widespread gains than their peers, with 90% reporting improvements in profitability, productivity and innovation, compared with around 60% overall.

Commenting on the results from the survey, Cisco president and chief product officer Jeetu Patel stated that the AI readiness index makes one thing clear: AI doesn’t fail – readiness fails, adding: “The most AI-ready organisations – the pacesetters from our research – prove it. They’re four times more likely to move pilots into production and 50% more likely to realise measurable value. So, with more than 80% of organisations we surveyed about to deploy AI agents, these new findings confirm readiness, discipline and action are key to unlocking value.”




Patch Tuesday: Windows 10 end of life pain for IT departments | Computer Weekly



The day Microsoft officially ended support for Windows 10 coincided with a Patch Tuesday update addressing several zero-day flaws that attackers could exploit to target the older operating system.

Among these is CVE-2025-24990, which covers a legacy device driver that Microsoft has removed entirely from Windows. “The active exploitation of CVE-2025-24990 in the Agere Modem driver (ltmdm64.sys) shows the security risks of maintaining legacy components within modern operating systems,” warned Ben McCarthy, lead cyber security engineer at Immersive.

“This driver, which supports hardware from the late 1990s and early 2000s, predates current secure development practices and has remained largely unchanged for years,” he said. “Kernel-mode drivers operate with the highest system privileges, making them a primary target for attackers seeking to escalate their access.”

McCarthy said threat actors are using this vulnerability as a second stage for their operations. “The attack chain typically begins with the actor gaining an initial foothold on a target system through common methods like a phishing campaign, credential theft, or by exploiting a different vulnerability in a public-facing application,” he said.

McCarthy added that Microsoft’s decision to remove the driver entirely, rather than issue a patch, is a direct response to the risks associated with modifying unsupported, third-party legacy code. “Attempts to patch such a component can be unreliable, potentially introducing system instability or failing to address the root cause of the vulnerability completely,” he said.

In removing the driver from the Windows operating system, McCarthy said Microsoft has prioritised reducing the attack surface over absolute backward compatibility. “By removing the vulnerable and obsolete component, the potential for this specific exploit is zero,” he said. “The security risk presented by the driver was determined to be greater than the requirement to continue supporting the outdated hardware it serves.”

McCarthy said this approach demonstrates that an effective security strategy must include the lifecycle management of old code, where removal is often more definitive and secure than patching.
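For administrators who want a quick way to tell whether a machine still carries the obsolete component before the update removes it, a minimal check is sketched below. It assumes the standard Windows driver location (System32\drivers under the system root); the presence of the file does not by itself confirm exploitability.

```python
# Minimal sketch: check whether the legacy Agere modem driver (ltmdm64.sys)
# is present in the standard Windows drivers directory. Presence alone does
# not prove the system is exploitable, only that the legacy component exists.
import os
from pathlib import Path

def agere_driver_present() -> bool:
    system_root = os.environ.get("SystemRoot", r"C:\Windows")
    driver = Path(system_root) / "System32" / "drivers" / "ltmdm64.sys"
    return driver.exists()

if __name__ == "__main__":
    if agere_driver_present():
        print("ltmdm64.sys found - apply the October update, which removes the driver")
    else:
        print("ltmdm64.sys not found")
```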

Another zero-day flaw being patched concerns the Trusted Platform Module (TPM) from the Trusted Computing Group (TCG). Adam Barnett, lead software engineer at Rapid7, noted that the CVE-2025-2884 flaw concerns the TPM 2.0 reference implementation which, under normal circumstances, is likely to be replicated in each manufacturer’s downstream implementation.

“Microsoft is treating this as a zero-day despite the curious circumstance that Microsoft is a founder member of TCG, and thus presumably privy to the discovery before its publication,” he said. “Windows 11 and newer versions of Windows Server receive patches. In place of patches, admins for older Windows products such as Windows 10 and Server 2019 receive another implicit reminder that Microsoft would strongly prefer that everyone upgrade.”

One of the patches classified as “critical” has such a profound impact that some security experts advise IT departments to patch immediately. McCarthy warned that the CVE-2025-49708 critical vulnerability in the Microsoft Graphics Component, although classed as an “elevation of privilege” security issue, has a severe real-world impact.

“It is a full virtual machine [VM] escape,” he said. “This flaw, with a CVSS score of 9.9, completely shatters the security boundary between a guest virtual machine and its host operating system.”

McCarthy urged organisations to prioritise patching this vulnerability because it invalidates the core security promise of virtualisation.

“A successful exploit means an attacker who gains even low-privilege access to a single, non-critical guest VM can break out and execute code with system privileges directly on the underlying host server,” he said. “This failure of isolation means the attacker can then access, manipulate or destroy data on every other VM running on that same host, including mission-critical domain controllers, databases or production applications.”


