What are the storage requirements for AI training and inference? | Computer Weekly
Despite ongoing speculation around an investment bubble that may be set to burst, artificial intelligence (AI) technology is here to stay. And while an over-inflated market may exist at the level of the suppliers, AI is well-developed and has a firm foothold among organisations of all sizes.
But AI workloads place specific demands on IT infrastructure and on storage in particular. Data volumes can start big and then balloon, in particular during training phases as data is vectorised and checkpoints are created. Meanwhile, data must be curated, gathered and managed throughout its lifecycle.
In this article, we look at the key characteristics of AI workloads, the particular demands of training and inference on storage I/O, throughput and capacity, whether to choose object or file storage, and the storage requirements of agentic AI.
What are the key characteristics of AI workloads?
AI workloads can be broadly categorised into two key stages – training and inference.
During training, processing focuses on what is effectively pattern recognition. Large volumes of data are examined by an algorithm – likely part of a deep learning framework like TensorFlow or PyTorch – that aims to recognise features within the data.
This could be visual elements in an image or particular words or patterns of words within documents. These features, which might fall under the broad categories of “a cat” or “litigation”, for example, are given values and stored in a vector database.
The assigned values provide further detail. For example, "a tortoiseshell cat" would comprise discrete values for "cat" and "tortoiseshell" that together make up the whole concept and allow comparison and calculation between images.
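As a loose illustration of how such stored values enable comparison, the sketch below uses NumPy with tiny made-up vectors (real embeddings run to hundreds or thousands of dimensions) to compute cosine similarity between items:

```python
import numpy as np

# Hypothetical, tiny embeddings: real models produce vectors with
# hundreds or thousands of dimensions, not two.
tortoiseshell_cat = np.array([0.9, 0.8])   # strong "cat" and "tortoiseshell" signal
black_cat         = np.array([0.9, 0.1])   # strong "cat", little "tortoiseshell"
litigation_doc    = np.array([0.0, 0.05])  # unrelated concept

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity of two vectors, independent of their magnitude."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(tortoiseshell_cat, black_cat))      # high: both are cats
print(cosine_similarity(tortoiseshell_cat, litigation_doc)) # low: unrelated
```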
Once the AI system is trained on its data, it can then be used for inference – literally, to infer a result from production data that can be put to use for the organisation.
So, for example, we may have an animal-tracking camera and want it to alert us when a tortoiseshell cat crosses our garden. To do that, it would infer the presence or absence of a cat, and whether it is tortoiseshell, by reference to the dataset built during the training described above.
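A minimal sketch of that inference step in PyTorch might look like the following; the pretrained ResNet and the image file are hypothetical stand-ins for whatever model and data the deployed system actually uses:

```python
import torch
from torchvision import models
from PIL import Image

# A stock pretrained ResNet stands in here for the organisation's own trained model.
weights = models.ResNet18_Weights.DEFAULT
model = models.resnet18(weights=weights)
model.eval()

# The weights ship with their matching image pre-processing pipeline.
preprocess = weights.transforms()

# "camera_frame.jpg" is a hypothetical capture from the garden camera.
frame = preprocess(Image.open("camera_frame.jpg")).unsqueeze(0)  # add a batch dimension

with torch.no_grad():  # inference only, no gradients needed
    scores = model(frame)

# Index of the most likely class; a real system would map this to its own
# labels (for example "tortoiseshell cat") and trigger an alert accordingly.
print(scores.argmax(dim=1).item())
```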
But, while AI processing falls into these two broad categories, it is not necessarily so clear cut in real life. Training will always be done on an initial dataset, but after that, while inference is an ongoing process, training is also likely to become perpetual as new data is ingested and new inference results from it.
So, to labour the example, our cat-garden-camera system may record new cats of unknown types and begin to categorise their features and add them to the model.
What are the key impacts on data storage of AI processing?
At the heart of AI hardware are specialised chips called graphics processing units (GPUs). These do the grunt processing work of training and are incredibly powerful, costly and often difficult to procure. For these reasons their utilisation rates are a major operational IT consideration – storage must be able to handle their I/O demands so they are optimally used.
Therefore, data storage that feeds GPUs during training must be fast, so it’s almost certainly going to be built with flash storage arrays.
Another key consideration is capacity. That’s because AI datasets can start big and get much bigger. As datasets undergo training, the conversion of raw information into vector data can see data volumes expand by up to 10 times.
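For a rough, purely illustrative sense of scale, the back-of-envelope calculation below assumes 100 million embedded items at 1,536 dimensions each, stored as 32-bit floats; the figures are hypothetical and exclude indexes, metadata and replicas:

```python
# Back-of-envelope estimate of raw embedding storage (illustrative figures only).
chunks = 100_000_000     # number of embedded items, e.g. document chunks or images
dimensions = 1_536       # vector length per item
bytes_per_value = 4      # 32-bit floats

raw_bytes = chunks * dimensions * bytes_per_value
print(f"{raw_bytes / 1e12:.2f} TB of vectors")   # ~0.61 TB before indexes, metadata and replicas
```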
Also, during training, checkpointing is carried out at regular intervals, often after every “epoch” or pass through the training data, or after changes are made to parameters.
Checkpoints are similar to snapshots, and allow training to be rolled back to a point in time if something goes wrong so that existing processing does not go to waste. Checkpointing can add significant data volume to storage requirements.
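A minimal sketch of per-epoch checkpointing in PyTorch, assuming a hypothetical model, optimiser and training loop defined elsewhere, might look like this:

```python
import torch

# Assumes `model`, `optimizer`, `train_one_epoch` and `num_epochs` are defined
# elsewhere in the (hypothetical) training script.
for epoch in range(num_epochs):
    train_one_epoch(model, optimizer)

    # Persist everything needed to resume from this point if a later step fails.
    torch.save(
        {
            "epoch": epoch,
            "model_state": model.state_dict(),
            "optimizer_state": optimizer.state_dict(),
        },
        f"checkpoints/epoch_{epoch:04d}.pt",
    )

# Rolling back: reload the last good checkpoint (path is illustrative) and resume.
checkpoint = torch.load("checkpoints/epoch_0007.pt")
model.load_state_dict(checkpoint["model_state"])
optimizer.load_state_dict(checkpoint["optimizer_state"])
```

Each such file contains the full model and optimiser state, which is why regular checkpointing adds materially to storage requirements.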
So, sufficient storage capacity must be available, and will often need to scale rapidly.
What are the key impacts of AI processing on I/O and capacity in data storage?
The I/O demands of AI processing on storage are huge. It is often the case that the model data in use simply will not fit into a single GPU's memory and so is parallelised across many of them.
Also, AI workloads and I/O differ significantly between training and inference. As we’ve seen, the massive parallel processing involved in training requires low latency and high throughput.
While low latency is a universal requirement during training, throughput demands may differ depending on the deep learning framework used. PyTorch, for example, stores model data as a large number of small files while TensorFlow uses a smaller number of large model files.
The model used can also impact capacity requirements. TensorFlow checkpointing tends towards larger file sizes, plus dependent data states and metadata, while PyTorch checkpointing can be more lightweight. TensorFlow deployments tend to have a larger storage footprint generally.
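To get a feel for why checkpoints add significant volume, the rough, hypothetical estimate below assumes a 7-billion-parameter model trained in mixed precision with the Adam optimiser; the exact breakdown varies by framework and configuration:

```python
# Illustrative checkpoint-size estimate; real figures depend on precision,
# optimiser choice and what the framework writes into each checkpoint.
parameters = 7_000_000_000           # a hypothetical 7-billion-parameter model

weights_fp16 = parameters * 2        # 16-bit model weights
master_fp32  = parameters * 4        # full-precision copy kept in mixed-precision training
adam_moments = parameters * 4 * 2    # two 32-bit Adam moment tensors

total_bytes = weights_fp16 + master_fp32 + adam_moments
print(f"{total_bytes / 1e9:.0f} GB per checkpoint")   # roughly 98 GB before metadata
```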
If the model is parallelised across numerous GPUs, checkpoint writes and restores are affected accordingly, which means storage I/O must be up to the job.
Does AI processing prefer file or object storage?
While AI infrastructure isn’t necessarily tied to one or other storage access method, object storage has a lot going for it.
Most enterprise data is unstructured and exists at scale, and it is often what AI has to work with. Object storage is supremely well suited to unstructured data because of its ability to scale. It also comes with rich metadata capabilities that can help with data discovery and classification before AI processing begins in earnest.
File storage stores data in a tree-like hierarchy of files and folders. That can become unwieldy to access at scale. Object storage, by contrast, stores data in a “flat” structure, by unique identifier, with rich metadata. It can mimic file and folder-like structures by addition of metadata labels, which many will be familiar with in cloud-based systems such as Google Drive, Microsoft OneDrive and so on.
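As a loose sketch of how that metadata travels with the data, the example below uses the S3-compatible API via boto3; the bucket name, object key and labels are hypothetical:

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical bucket, key and metadata labels.
s3.put_object(
    Bucket="training-corpus",
    Key="images/2024/cat_0001.jpg",    # a flat key that merely mimics a folder path
    Body=open("cat_0001.jpg", "rb"),
    Metadata={                         # user-defined metadata stored with the object
        "species": "cat",
        "coat": "tortoiseshell",
        "source": "garden-camera",
    },
)

# Later, that metadata can drive discovery and classification before training begins.
head = s3.head_object(Bucket="training-corpus", Key="images/2024/cat_0001.jpg")
print(head["Metadata"])
```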
Object storage can, however, be slow to access and lacks file-locking capability, though this is likely to be of less concern for AI workloads.
What impact will agentic AI have on storage infrastructure?
Agentic AI uses autonomous AI agents that can carry out specific tasks without human oversight. They are tasked with autonomous decision-making within specific, predetermined boundaries.
Examples would include the use of agents in IT security to scan for threats and take action without human involvement, to spot and initiate actions in a supply chain, or in a call centre to analyse customer sentiment, review order history and respond to customer needs.
Agentic AI is largely an inference-phase phenomenon, so compute infrastructure will not need to handle training-type workloads. Having said that, AI agents will potentially access multiple data sources across on-premises systems and the cloud. That will span the whole range of storage types in terms of performance.
But, to work at its best, agentic AI will need high-performance, enterprise-class storage that can handle a wide variety of data types with low latency and with the ability to scale rapidly. That’s not to say datasets in less performant storage cannot form part of the agentic infrastructure. But if you want your agents to work at their best you’ll need to provide the best storage you can.
Americans Are Increasingly Convinced That Aliens Have Visited Earth
Americans are becoming more open to the idea that aliens have visited Earth, according to a series of polls that show belief in alien visitation has been steadily on the rise since 2012.
Almost half—47 percent—of Americans say they think aliens have definitely or probably visited Earth at some point in time, according to a new poll from YouGov conducted in November 2025 that involved 1,114 adult participants. That percentage is up from roughly a third (36 percent) of Americans polled in 2012 by Kelton Research, with the exact same sample size. Gallup published polls on this question in 2019 and 2021 that likewise show an upward trend.
Moreover, people seem to be getting off the fence on this issue, one way or the other. Just 16 percent of Americans said they were unsure if aliens had visited Earth in the new poll, down from 48 percent who were unsure in 2012. Meanwhile, even as belief in alien visitation has risen, so has doubt: The new poll shows that 37 percent of Americans said Earth likely hasn’t been visited by aliens, more than double the 17 percent logged in 2012.
It’s impossible to know exactly why Americans have become more receptive to alien visitation from these polls alone; they only include raw statistics, and lack granular details about the specific motivations for the participants’ responses.
“It’s important to note that this is a poll about belief,” says Susan Lepselter, an author and associate professor of anthropology and American Studies at Indiana University who has written extensively on alien beliefs and UFO experiences. “It’s not a poll about experience, contact, feelings—nothing like that.”
“We don’t know what their engagement is; we don’t know if their belief has been life-changing,” she adds. “We just know one thing, which is that the statistics have moved from one set of beliefs to another.”
Of course, it’s still possible—and let’s be real, fun—to speculate on the drivers of the trend. One obvious culprit is a new posture from institutional news sources, such as the US government and legacy media, which have finally started taking unidentified anomalous phenomena (UAP) seriously.
This shift began with the release of mysterious Pentagon UAP videos by The New York Times in 2017, and has since been accelerated by a spate of Congressional hearings and a NASA independent study on UAP. The newly released documentary The Age of Disclosure, which features claims by former military officials that the US government has covered up evidence of aliens visiting Earth, has supercharged the legitimacy of this once-marginalized topic.
Get Up to 50% Off Select Items With These Ring Camera Deals
If you’re a fan of Amazon’s ecosystem, whether that’s asking your Alexa speaker to tell you about the weather or compulsively checking the video feed from your Ring doorbell, then it makes sense to expand and build onto the system. It’s always easier to keep to one ecosystem as much as you can with smart home gear, letting you stick to a single app and single subscription if you decide to invest in one.
While we’ve liked Ring’s cameras and home security products fine enough, they’re hard to recommend at the top of our guides since Ring is reintroducing a policy that enables local law enforcement to request footage directly from Ring users. It’s up to you if that’s something you want to invest in, and if you already have Ring products, it might make the most sense to continue adding onto that ecosystem rather than diving into a new one.
No matter the reason, if you’re looking to add a Ring product to your home, don’t get one without using our Ring coupon codes to get it for a better price.
50% Off Ring Cameras, Doorbells, and Outdoor Cameras
Ring is running a deal all month long with up to 50% off different products and bundles. You can get all kinds of Ring cameras and security accessories for a variety of discounts, from Ring’s video doorbell to indoor and outdoor cameras.
Save $150 on Wired Doorbell Pro and Floodlight Cam
If you’re looking for an outdoor combination, you can get Ring’s Starter Pro Kit, which includes both the Wired Doorbell Pro and Floodlight Cam, for $150 off the set. It’s a great option if you want a camera feed both at your doorstep and over your garage.
Bundle and Save on Ring Whole Home Basic Kit
Looking to deck out your whole home? Ring’s Whole Home Basic Kit is also discounted, at $59 off. It includes Ring’s Outdoor Cam Plus Battery, Battery Doorbell, and the Alarm Security Kit, so you get everything from video surveillance around the outside of your home to sensors that pair with the alarm system inside it.
Ring has a variety of subscription plans, which you’ll want since there’s no option to locally store your video footage. That means in order to play any video back to see what set off the camera or who was at the door, you’ll need one of these plans. Here’s a quick breakdown.
Basic Ring Plan: Get the basics with video event playback and smart notifications for one camera. $5 per month or $50 per year.
Standard Plan: All the core Ring experience with enhanced features for all your devices. $10 per month or $100 per year.
Premium Plan: Ring home the best of the best with the most advanced AI and recording features. $20 per month or $200 per year.
Stay Connected With $29 Off Pet Basic Kit + Pet Tag
Ring has a pet package you can get for a discount, too. You’ll get both Ring’s Indoor Cam and the Pet Tag, which has a QR code that lets anyone who finds your pet scan it and get your information to contact you. It’s 50% off right now, so if you’re looking for new tags and a camera to keep an eye on your favorite furry companion, this is your moment.
UK mobile improves but digital divides persist | Computer Weekly
Mobile connectivity across the UK is becoming faster and more responsive on average; a marked gap still persists between the quality of experience in urban and rural areas; and the gap between the best and worst-performing local authorities remains significant, according to research from Ookla.
The analyst’s Speedtest Intelligence report for 2025 takes an overview of mobile network performance across the UK, focusing on outcomes at local authority level and how those outcomes have changed over time.
The study was based on millions of samples from mobile devices connected to a cellular network, comparing results from Q1–Q3 2025 with the same period in 2024. For each local authority, the report considered not only typical speeds, but also the experience of slower connections, and the relationship between population density and mobile outcomes. At UK and country (nation) level, it drew on national aggregate metrics (2025 to date) for the UK, England, Scotland, Wales and Northern Ireland.
Fundamentally, the research found that population density correlates strongly with better outcomes. In practical terms, the findings illuminate the urban-rural digital divide, showing that where you live in the UK largely dictates your mobile experience.
Analysis of local authority outcomes revealed what Ookla called the “stark” extent of regional variation in and across nations in the UK. Despite the general upward shift in the overall local authority distribution over the past year across key mobile performance indicators, the range remains large and many rural local authority areas are still stuck with not-spots despite the progress of the government’s shared rural network (SRN) scheme. Areas that were strong performers in 2024 generally remained strong, and many of the weakest authorities in 2024 still sit near the bottom of the distribution in 2025.
On a country level, UK mobile performance improved notably between 2024 and 2025, with the national median download speed rising from approximately 55.02Mbps to 63.03Mbps. This represented a year-on-year increase of around 15%. Median upload speeds inched up from 7.80Mbps to 8.21Mbps, while median latency improved marginally from 52ms to 50ms.
England and Northern Ireland saw the strongest gains, while Wales remained the slowest nation and Scotland’s median slipped from 49.13Mbps to 46.05Mbps despite improvements in several local authorities. Overall, though, the UK rates badly compared with European peers such as Germany and the Republic of Ireland.
Drilling deeper, the study showed that the gap between local authorities remained stark. In Q1–Q3 2025, median speeds ranged from just over 10Mbps in the Shetland Islands to just over 100Mbps in Leicester. Around 28% of local authorities had fewer than 60% of test samples meeting a 25Mbps download threshold, indicating persistently poor connectivity for many in the UK.
Alongside the aforementioned Leicester, top performers included Nottingham, Derby, Bridge of Don, Thurrock and Stoke-on-Trent. These areas typically combine median download speeds in the mid-80s to 100Mbps, roughly three-quarters or more of samples reaching 25Mbps, and relatively strong results even in the slowest 10th percentile (generally around 8–11Mbps).
In addition to the Shetland Islands, the country’s weakest performers included the Isle of Anglesey, Fermanagh and Omagh, Denbighshire, Pembrokeshire, Orkney, and Cornwall. These areas have median download speeds mostly in the mid-teens to low-20s – excluding the Shetland Islands – with less than half of samples reaching 25Mbps and 10th-percentile speeds typically in the 1.5–3Mbps range, highlighting large not-spots for a significant share of users there.
Looking at the companies driving the industry, the study noted that heavy capital spending by the UK’s operators was driving improved outcomes. It added that the UK remains one of only a handful of countries in Europe and globally where at least three operators have “aggressively” deployed 5G standalone across a significant footprint.
Virgin Media O2 has already reported 70% population coverage and BT/EE boasts a similar level. VodafoneThree has committed to invest £11bn in its UK network over the next decade, including £1.3bn of capex in year one.