Research from leading UK connectivity provider BT has concluded that now is the time to upgrade the UK’s critical national infrastructure (CNI) sectors from outdated analogue networks to digital infrastructure, a move whose financial, societal and environmental benefits could deliver a £3bn net benefit to the economy by 2040.
The study, conducted by Assembly Research, evaluated the costs, risks and potential gains from digital migration across energy, water, health (NHS), emergency services and local government. It accounted for the direct cost of upgrading, as well as the rising expense of maintaining legacy systems like the public switched telephone network (PSTN) and the 2G mobile network – both decades old and increasingly challenging to support. Data from UK comms regulator Ofcom shows that resilience incidents on the PSTN have risen by 45%, underscoring the urgency of change.
The UK’s transition to digital connectivity is a major national infrastructure programme endorsed by Ofcom and the government. The PSTN will be fully retired in January 2027, with businesses and public services urged to complete their migrations by the end of 2025 to avoid last-minute disruption. In 2024 alone, BT migrated nearly 300,000 legacy PSTN business lines. Yet many CNI providers in the UK still rely on ageing analogue systems for critical operations, while other countries are moving faster.
The research showed that digital network upgrades could transform a number of sectors – in particular the NHS, councils, ambulance and other emergency services, and the energy sector – while also delivering environmental gains, freeing up resources and preventing unnecessary callouts on a large scale.
For example, the study calculated that 600,000 NHS staff hours and 12 million council staff hours could be freed up – the equivalent of 6,500 staff working full-time for a year. It also estimated that up to 750,000 unnecessary ambulance trips could be prevented, avoiding more than 100 journeys every day. The fire service could avoid a further 280,000 false callouts by retiring outdated fire alarm systems.
Outside the health sector, the study also pointed to environmental gains of 3.42 megatonnes of avoided carbon emissions – equivalent to the emissions from powering every home in Birmingham, the UK’s second-largest city, for a year.
In the energy sector, the study found that digital networks could deliver improved resilience, help prevent outages and enable more accurate demand forecasting, translating to an estimated £1.4bn in savings. In the water sector, smarter network monitoring and reduced electricity usage could generate efficiencies worth £771m.
In addition, the study argued that local governments stand to gain £486m by modernising telecare systems and cutting the cost of maintaining ageing analogue equipment. In the NHS, digital transformation promises better call handling and more efficient emergency response. The study added that emergency services could see fewer false alarms and improved call management, enabling faster, more targeted responses.
Jon James, CEO of BT Business, said the results of the study sent a clear message that delaying the shift to digital carries a real cost to public services, the environment and the wider economy. “Legacy systems are becoming increasingly unreliable, and the case for action is urgent,” he noted. “BT is committed to guiding the UK’s critical national infrastructure sectors through this upgrade with the resilience and support they need.”
Matthew Howett, founder and CEO of Assembly Research, said: “For the first time, we’ve lifted the lid on legacy network migration, and worked to understand the scope and scale of how key UK industries are still relying on ageing fixed and mobile networks. Our research found that while the energy and water sectors are already well into their migrations, it’s vital that others follow to avoid growing costs and missed efficiencies.”
We have tested many other USB flash drives that did not make the cut. Here are a few that might be worth considering for some folks.
Kingston Dual Portable SSD (1 TB) for $229: With a snazzy metallic red body, this SSD disguised as a flash drive is very speedy, matching the stated 1,050 MB/s read and 950 MB/s write in my tests. It is USB 3.2 Gen 2, with a USB-C connector at one end and a USB-A connector at the other, both with removable covers. As much as I like this drive, which comes in 512-GB, 1-TB, and 2-TB models, it is on the pricey side.
Amazon Basics Flash Drive (128 GB) for $18: I like the grippy texture on the slider of this drive because it’s easy to open one-handed and locks in place securely. The loop at the top is perfect for a key ring, and it is lightweight. Performance was limited, as you might expect at this price, but it consistently exceeded the stated 130 MB/s read and 30 MB/s write speeds for larger files, though it only had 116 GB usable out of the box.
Buffalo External SSD-PUT Stick (500 GB) for $70: Another SSD in a flash drive body, this drive hit 450 MB/s in my tests and offers shock protection for falls. There’s also a sliding USB-A, and it comes with a USB-C adapter. It is pretty chunky for a flash drive, so you may find it blocks adjacent ports. You can also get reasonably priced 1- and 2-TB versions of this drive.
PNY Pro Elite V2 (256 GB) for $60: This sliding drive has a plastic cover to protect the USB-A plug, and was our compact pick for a while. It performed well (read and write speeds hovered around 415 MB/s and 425 MB/s) in my tests, and has an opening for a lanyard or keyring. I tested the 256-GB drive, but there are 512-GB and 1-TB models.
SanDisk Ultra Dual Drive Go (128 GB) for $29: This handy drive swivels to give you USB-C or USB-A, and comes in various sizes and some fun colors, but the lower capacity drives are slow (USB 3.1). You can get the 128 GB drive and up in USB 3.2 Gen 1 for up to 400 MB/s read and it’s a solid alternative to the PNY Duo above.
PNY Elite-X (128 GB) for $16: This super-compact, sliding drive has a USB-C 3.2 Gen 1 jack and a loop on the end to fit on a keyring. It worked fine but proved unremarkable in my tests (around 200 MB/s read, and 130 MB/s write).
Kingston IronKey Keypad 200 (16 GB) for $112: If you need a secure drive, Kingston’s IronKey boasts FIPS 140-3 certification, XTS-AES 256-bit encryption, and a special epoxy on its circuitry to make it impossible to remove components. On the downside, it is expensive, the keypad is fiddly, and 10 wrong entries wipe the drive.
Samsung Bar Plus (256 GB) for $52: An elegant, one-piece, curved design makes this drive easy to withdraw and there’s a loop so you can slip it onto a keyring. Test read speeds were just shy of 400 MB/s, with write speeds just over 100 MB/s, but the smaller drives (32 GB and 64 GB) are significantly slower. The Bar Plus is also a durable option, with Samsung claiming it is waterproof, shock-proof, temperature-proof, magnet-proof, and x-ray-proof.
Avoid These Flash Drives
Silicon Power DS72 Portable SSD (1 TB): This is a reasonable price for a 1-TB drive with USB 3.2 Gen 2 USB-A and USB-C connectors, and it consistently hit 450 MB/s read and write speeds in my tests (it can hit 1,050 MB/s and 850 MB/s with the right gear). It got quite warm to the touch, but the reason I don’t recommend this drive is the stupid plastic connector covers. You have to bend them back, and they get in the way when you’re trying to insert the drive.
Verbatim Dual (64 GB): This teeny drive is cheap and has both USB-A and USB-C plugs, but I found write speeds were variable (60 MB/s for USB-C and 90 MB/s for USB-A) and read speeds were around 150 MB/s for both. There is a cover for the USB-A and a wee strap you can attach, but this drive is almost too small, and it proved awkward to insert and remove. It also comes in 16- or 32-gigabyte options.
How to Eject and Format Drives
It’s a good idea to format your USB flash drive before you start using it. You’ll usually be asked what format you want to use. Almost every device will recognize the FAT32 format, but it limits the individual file size to 4 GB. Go for exFAT if you have larger files. If you format a flash drive, it will completely wipe everything stored on it. Here’s how to do it manually:
On a Windows computer: Open File Explorer and look for your drive under This PC. Right-click on it, and select Format.
On a Mac: Type Disk Utility into Spotlight, or find it via Applications > Utilities. Select your drive from the list and click Erase at the top. Then you can rename the drive and choose a format.
On a Chromebook: Open Files and right-click on your drive to choose Format device.
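If you are not sure whether FAT32’s 4-GB file-size cap will be a problem, you can check a folder before formatting. Here is a minimal Python sketch (the folder you pass in is whatever you plan to copy to the drive):

```python
from pathlib import Path

# FAT32 caps individual files at 4 GiB minus one byte
FAT32_LIMIT = 4 * 1024**3 - 1

def needs_exfat(folder: str) -> bool:
    """Return True if any file in the folder exceeds the FAT32 file-size cap."""
    return any(p.stat().st_size > FAT32_LIMIT
               for p in Path(folder).rglob("*") if p.is_file())
```

If this returns True for your files, pick exFAT when formatting; otherwise FAT32 will maximize compatibility.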
You’re probably familiar with warnings about removing a drive without ejecting it first. They’re worth heeding, because there’s a genuine risk your data will be corrupted. Here’s how to do it properly:
On a Windows computer: Click the Safely Remove Hardware notification icon in the system tray and choose the Eject option. If you’d rather not have to eject, type Device Manager into the search bar and open it, expand Disk Drives, right-click your USB flash drive, choose Properties, open the Policies tab, and select Quick Removal.
On a Mac: You will see an eject icon listed next to the drive name in Finder, or you can simply drag the flash drive image on your desktop to the trash.
On a Chromebook: Open Files and right-click on your drive, then select Eject device.
On an Android device: You can open and expand the USB notification to find an Eject option.
On an iPhone or iPad: There is no eject option. Make sure no data transfer is in progress before you unplug the drive, and close the Files app or whichever app you used to transfer files.
How to Get the Most From Your USB Flash Drive
There are a few things to keep in mind when you’re shopping for USB flash drives, and we also have some tips for using them.
Capacity: To decide on the capacity of the storage device you need, first check the size of the folders or files you want to copy. Each USB drive in our guide has a stated capacity, but the usable storage will be slightly less than that, because the device’s firmware requires space.
Speed: USB standards are advancing all the time, and we recommend USB 3.0 as a minimum, though higher is better. While USB standards have different theoretical maximum speeds, it’s crucial to check the manufacturer’s stated read and write speeds for each drive. If you’re primarily transferring data, you’ll want to look for a drive with high write speeds. If you’re planning on launching software on a computer through the drive (like a video game), then you’ll want a model with high read speeds. Manufacturers will state average speeds, but most drives are much faster at transferring large files and tend to be far slower at transferring small files.
Compatibility: Many flash drives will work with any device with the relevant USB port, but check compatibility to avoid disappointment. If you want to use a drive with an Android device or something from the iPhone 16 range or later, it will require USB on-the-go (OTG) support. Most Android devices do support USB OTG. You will get a notification when you insert a flash drive with options that should include File Transfer. You can try the USB OTG Checker app to confirm support if you’re unsure. Apple’s earlier iPhones and iPads don’t support USB OTG, but you can install a companion app for drives, like SanDisk’s iXpand series.
Connectors: Most flash drives have USB-A connectors, but you can also get drives with USB-C, MicroUSB, and Lightning connectors. If you plan on using a flash drive with your smartphone and computer, snag one with both of the required types of connectors. You can also buy USB hubs with multiple USB ports or adapters, but pay close attention to the supported standard or it may limit your data transfer speeds. This Anker USB-A to USB-C adapter, for example, is USB 3.0.
Security: Remember that USB drives can cause security issues, particularly for businesses, and you should never plug in random drives you find lying around. If you plan to keep sensitive data on your flash drive, then consider biometric or passcode protection, and look into the level of encryption it offers. There are software services that offer encryption and allow you to password-protect your files on any USB flash drive.
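The capacity shortfall mentioned above is partly a units issue: manufacturers quote decimal gigabytes (10^9 bytes), while Windows and some other systems report binary gibibytes (2^30 bytes), before any filesystem overhead is subtracted. A quick conversion shows why a “128 GB” drive reads as roughly 119 GB:

```python
def advertised_gb_to_reported_gib(gb: float) -> float:
    """Convert a marketing capacity in decimal GB to the binary
    gibibyte figure many operating systems display as 'GB'."""
    return gb * 10**9 / 2**30

print(round(advertised_gb_to_reported_gib(128), 1))  # 119.2
```

Formatting and firmware overhead then shave off a little more, which is why a 128-GB drive may show 116 GB usable out of the box.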
I run read and write speed tests on every drive using USB Flash Benchmark and CrystalDiskMark. I also load HD and 4K movies onto each drive and play them on an LG OLED TV, make photo backups from phones and laptops, and copy files across supported devices. I have tested some drives as security camera backups, as NAS (network attached storage) in routers, for playing MP3 music files, and to load games and saves onto various retro consoles. Our top picks continue to be used regularly for file backups over months, so we can be confident that the performance does not degrade.
The key question regarding edge artificial intelligence (AI) is no longer about its vast business potential, but about where it can be most efficient and deliver faster, measurable results. Early uses across the manufacturing, retail and infrastructure sectors have so far focused on areas such as predictive machine maintenance; tailored, localised analytics in retail stores; and grid monitoring.
However, cost constraints, latency and data residency continue to require careful consideration by organisations looking to scale edge AI strategies.
“Early deployments should focus on narrow beachhead use cases where ethical, legal and security risks are limited – or clearly outweighed by the benefits,” observes Michaël Bikard, professor of strategy at the Insead business school. “That’s how new technologies have historically entered safety-sensitive domains.”
Edge AI is being used practically right now
Many global businesses have adopted edge AI in some capacity. However, most deployments remain relatively small and highly specialised, prioritising speed, reliability and energy efficiency over huge, datacentre-like models. They also depend significantly on human oversight and intervention.
Current deployments focus on working within edge AI’s limitations rather than on ultra-sophisticated models. Most are still hybrid, with humans handling most of the training and performance evaluation while the model handles local inference.
Edge AI systems are optimised to recommend the best course of action, rather than make fully independent decisions. In highly regulated or safety-critical businesses, humans still have the final say.
Successful deployments show that edge AI is less about more intelligent technology and more about ensuring that reliable decisions can be taken closer to where the data is generated.
What’s working: Predictive machine maintenance in manufacturing
Schneider Electric believes it has significantly advanced the industrial internet of things (IIoT) by using edge AI for real-time predictive maintenance on the factory floor, through local controllers, servers and devices. This is designed to improve operational efficiency while strengthening data security and reducing latency.
The company uses edge AI systems to analyse factors such as real-time temperature, vibration and performance to predict machine issues before they occur, which helps decrease production stoppages.
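Schneider Electric’s actual models aren’t public, but the core idea – flagging a sensor reading that drifts outside a machine’s normal band – can be sketched in a few lines. The readings and threshold below are hypothetical:

```python
from statistics import mean, stdev

def flag_anomaly(history: list[float], latest: float, z_limit: float = 3.0) -> bool:
    """Flag a reading that falls outside the machine's normal range.

    A z-score check against recent history is the simplest form of the
    drift detection that predictive-maintenance systems build on.
    """
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return latest != mu
    return abs(latest - mu) / sigma > z_limit

# Hypothetical vibration readings (mm/s) within a stable band, then a spike
normal = [2.1, 2.0, 2.2, 2.1, 1.9, 2.0, 2.1, 2.2, 2.0, 2.1]
print(flag_anomaly(normal, 2.1))  # False
print(flag_anomaly(normal, 4.8))  # True
```

Production systems layer far richer models (and multiple sensor channels) on top of this, but the goal is the same: catch the drift before the machine stops.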
It also employs edge AI for automated inspection and image-based barcode reading, which improves product quality. The Cognex AI-based technology can detect objects and shapes, allowing conveyor cameras to automatically reject flawed products.
Schneider Electric also focuses on enhanced autonomous machine control through its EcoStruxure Automation Expert virtualised controller system, which connects shop-floor IoT devices to edge controllers. The company also uses edge AI to increase yield by analysing variables in real time and reducing waste.
Automotive giant Renault has also deployed edge AI tools for predictive maintenance in manufacturing, mainly by supervising welding robots so that welding defects and anticipated failures are flagged in real time, minimising downtime.
Renault’s Industrial Metaverse uses edge AI heavily to analyse real-time data from 12,000 connected machines, which strengthens production lines. This is said to have helped Renault Group save €270m in 2023. Similarly, Renault’s autonomous control systems conduct visual inspections through edge AI, further freeing up operator time.
“Predictive maintenance has emerged as one of the most commercially successful AI use cases; however, technology alone is insufficient. Stalled or underperforming deployments may cite poor data integration, fragmented ownership, or constraints from legacy systems as root causes,” says Himanshu Rai, director at IIM Indore. “Predictive maintenance succeeds when AI is embedded into operational workflows and decision processes, not deployed as a standalone analytics layer.”
Real-time inventory tracking and reduced food waste in retail
Fast fashion retailer H&M has partnered with Avassa to use edge AI to modernise in-store facilities, streamline operations and improve the customer experience. Another focus is making sure applications keep working even when connectivity is down.
One of the biggest uses of edge AI is through RFID-enabled tracking, a highly accurate system allowing inventory to be tracked with real-time data. This helps staff find in-store items immediately, significantly cutting down on customer wait times.
Other in-store edge AI deployments include smart mirrors in fitting rooms, which connect to local networks and deliver product recommendations. They let buyers see which items are in stock in real time and ask for other sizes if required, without having to leave the fitting room, which considerably enhances the customer experience.
Customers can look for items using photos through the TensorFlow Lite edge AI system on the H&M app, too, further speeding up performance.
H&M is partnering with Honeywell to use edge AI to optimise lighting, heating and air-conditioning across 90 European stores as well. By gathering data from smart meters and sensors, the system improves real-time energy usage, decreasing costs and carbon footprint at the same time.
Similarly, grocery giant Tesco has leaned heavily into edge AI with a recent three-year partnership with Mistral AI to optimise its supply chain and reduce food wastage. One of the models employs dynamic expiry pricing. The system evaluates expiry dates and how fresh produce is, and automatically reduces prices for items expiring soon.
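Details of the Tesco/Mistral AI system aren’t public; as an illustration of the dynamic expiry-pricing idea, here is a deliberately simple sketch in which the markdown window, discount cap and prices are all hypothetical:

```python
def markdown_price(base_price: float, days_to_expiry: int,
                   markdown_window: int = 3, max_discount: float = 0.6) -> float:
    """Discount an item linearly as it approaches its expiry date.

    Illustrative only: real dynamic-pricing models also weigh demand,
    stock levels and sell-through rates, not just the expiry date.
    """
    if days_to_expiry >= markdown_window:
        return base_price
    remaining = max(days_to_expiry, 0) / markdown_window
    discount = max_discount * (1 - remaining)
    return round(base_price * (1 - discount), 2)

print(markdown_price(2.00, 5))  # 2.0 (no discount yet)
print(markdown_price(2.00, 1))  # 1.2
print(markdown_price(2.00, 0))  # 0.8 (maximum discount)
```

Running such a rule at the edge, per store and per item, is what lets prices react to local shelf conditions in real time.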
This has helped bring Tesco a step closer to its goal of reducing food waste, with wastage levels across UK operations down by 45% in 2025, compared with 2016/2017 levels. Another major deployment is real-time logistics and shipments tracking across more than 3,000 locations through solar-powered sensors. Tesco also saves 100,000 miles per week by using AI to search for the most efficient delivery routes.
Edge AI is used for product demand prediction as well, improving fresh produce shelf life, which decreases the risk of overstocking. This reduces the need for manual checking and improves inventory management across the board.
Self-checkout processes have been upgraded with edge AI too, with stores now including smart systems with cameras that use AI and computer vision to monitor real-time packaging behaviour and flag incorrectly scanned items.
Grid monitoring and maintenance in energy and infrastructure
Siemens Energy is transforming legacy grid infrastructure into active, intelligent networks through edge AI, enabling grids to automatically handle rising demand and fluctuating renewable energy levels.
The approach embeds AI into grid assets such as substations, transformers and sensors, enabling predictive grid maintenance and real-time decision-making. Online sensor devices, such as the Sensformer advanced unit, keep tabs on high-voltage equipment and transformers.
Edge AI flags irregularities in temperature, vibration and torque through local data analysis. Operators can then maintain machines based on their actual condition rather than on a fixed schedule, avoiding expensive unplanned downtime.
Some sensors are virtual: physics-informed neural networks (PINNs) predict hotspots on components such as transformer bushings without any physical instrumentation.
Another edge AI deployment, dynamic line rating (DLR), analyses factors such as wind speed and temperature in real time and updates transmission line capacity ratings accordingly. Compared with potentially conservative static ratings, this unlocks an additional 10% to 15% of capacity more than 90% of the time.
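Real DLR systems use detailed thermal models such as the IEEE 738 standard; the toy sketch below captures only the principle that more cooling (wind) and more thermal headroom (lower ambient temperature) permit more current. All coefficients here are made up for illustration:

```python
import math

def dynamic_ampacity(ambient_c: float, wind_ms: float,
                     conductor_max_c: float = 75.0,
                     resistance_ohm_per_m: float = 7e-5) -> float:
    """Toy steady-state heat balance for a transmission line.

    Heat generated (I^2 * R) must equal heat removed by convection,
    which improves with wind. A rough illustration of the idea behind
    DLR, not the IEEE 738 model real systems use.
    """
    # Convective cooling coefficient rises with wind speed (simplified)
    cooling_w_per_m_per_c = 5.0 + 4.0 * math.sqrt(wind_ms)
    headroom_c = conductor_max_c - ambient_c
    return math.sqrt(cooling_w_per_m_per_c * headroom_c / resistance_ohm_per_m)

# A breezy, cool day supports more current than the still, hot worst case
# that a static rating must assume
static_rating = dynamic_ampacity(ambient_c=40.0, wind_ms=0.5)
dynamic_rating = dynamic_ampacity(ambient_c=15.0, wind_ms=4.0)
print(dynamic_rating > static_rating)  # True
```

Because static ratings are set for the worst plausible weather, a live calculation like this frees up capacity whenever conditions are better than that worst case.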
Siemens has also implemented intelligent substations as part of a hybrid approach, processing data locally and sharing only relevant information with the cloud, reducing bandwidth demands.
“New technologies gain traction not by being universally superior, but by outperforming the status quo in narrow contexts. In infrastructure, that often means environments requiring continuous, real-time monitoring at a scale or speed that humans or centralised systems cannot sustain,” Bikard observes.
Similarly, Ørsted uses edge AI for wind farm optimisation, analysing data from thousands of turbine sensors to sharpen predictive maintenance. It also monitors localised weather patterns, such as cloud cover and sun intensity, using the technology to improve battery storage utilisation and solar energy production.
Edge AI failures
Despite several successful edge AI deployments in the past few years, some projects have failed – often very publicly. McDonald’s AI-driven voice ordering trial, deployed across about 100 drive-throughs, was one such case. The fast-food chain launched a three-year partnership with IBM for the project in 2021, which ended in 2024 after a string of bad reviews.
Viral, embarrassing social media videos posted by customers showed the system misunderstanding orders, sometimes adding hundreds of dollars’ worth of food. Mistakes such as adding bacon to ice cream were also common.
Other problems included issues with background noise, different human accents and dialects, and unusual local requests.
What drove success – and where models broke down
Successful edge AI deployments across Schneider Electric, Tesco and Siemens Energy, among others, had one common trait: they all focused on extremely narrow processes, within broader organisational structures. Launched in very controlled environments, they only scaled incrementally, after rigorous testing and iterations.
“Each stage generates learning, not just about performance, but about failure modes, governance and acceptable risk. Those lessons make it possible to move from tightly controlled settings to more complex environments,” Bikard points out.
These models also have a very clear ownership and accountability structure, with specific people being responsible for outcomes or issues. These include operators, supervisors, production line managers, shop managers or similar.
Constant human supervision meant that any issues or downtime with the models could be immediately addressed with minimal repercussions. A hybrid approach between cloud and edge AI was consistently prioritised as well.
Successful deployments did not involve any absolutely critical processes either. Even in cases of predictive maintenance, both on factory floors and grids, their purpose was mainly to speed up and optimise the process, rather than take over completely.
On the other hand, one of the biggest pitfalls of the McDonald’s system was taking human oversight almost completely out of the loop, giving a pilot project more autonomy than it was designed to handle. Serious mistakes – such as hundreds of dollars of extra food being added to orders – went nearly unchecked, with customers having little recourse.
Another mistake was launching the initial trial across around 100 locations rather than a handful of well-monitored ones, exposing the system to far too much variability at once across many accents and dialects.
The model was also ill-suited to handling open-ended inputs – exactly what a fast-food drive-through, with its high volume of personalised requests, should expect.
Finally, as a well-recognised global brand, McDonald’s had very little margin for error: any public missteps would draw swift criticism from customers, so the system needed far more testing before launch.
“One key lesson is that data quality and domain expertise are more critical than algorithmic sophistication,” observes Florian Stahl, chair of quantitative marketing and consumer analytics at Mannheim Business School. He notes that many early failures can be traced to poorly labelled data, sensor drift, or an insufficient understanding of underlying physical processes.
What’s next?
As successful edge AI use cases increase, businesses are likely to move away from isolated experiments to more widespread deployments, through cameras, sensors, robots and other machines.
This may decrease cloud reliance while speeding up decision-making at the edge. However, the fundamental principle driving successful deployments will remain the same.
The most successful edge AI models will still be those that address highly specific tasks and scale incrementally, while having clear oversight, ownership and accountability structures, even if the number of endpoints grows.
“Framing adoption as a human-versus-AI contest misses where the real opportunities lie. What matters instead is identifying situations where existing solutions are clearly insufficient,” Bikard concludes.