While Aventon is known first and foremost as an ebike brand, the company started by making fixies in 2013. That gives it some bona fides when it comes to making enjoyable rides for experienced cyclists. (In addition to the Current ADV, there’s also a higher-end model, the Current EXP, with a more expensive carbon frame and better components.) Since its first venture into e-MTBs with the Ramblas in 2024, the company has continued to develop very nicely specced electric mountain bikes for the price.
The designers behind the newest iterations did a masterful job. The Current ADV looks 100 percent the part of a contemporary mountain bike. With its 6061 aluminum frame, SRAM Eagle groupset, tubeless-ready Maxxis Minion tires wrapping a pair of double-walled 29-inch wheels, a 170-mm X-Fusion Manic dropper post, a RockShox Psylo Gold fork boasting 150 mm of travel, and a RockShox Deluxe Select+ rear shock, it’d be easy to mistake the Current ADV for a traditional analog mountain bike.
Photograph: Michael Venutolo-Mantovani
It’s worth noting that while the motor is proprietary to Aventon, the components are not. It might be difficult to get your local bike shop to look at the battery and motor, but assuming those are fine, it won’t be hard to swap anything else out should you need to repair it.
Despite its design and ride feel, both of which can make you forget you’re riding electric, the Current ADV is a class 1 e-MTB (which can be toggled to class 3 via the brand’s app), and one that delivers hours and hours of riding on a single charge.
The 800-watt-hour battery is tucked neatly into the bike’s relatively small downtube, giving a claimed range of up to 105 miles. Of course, I didn’t get nearly that, as I was constantly switching among the Current ADV’s five power modes (Auto, Eco, Trail, Turbo, and a new, 30-second Boost Mode for extra torque on big hills). Still, the longest day I spent in the bike’s super-comfy Selle Royal SRX saddle was about three hours. In that time, the battery dropped only about 20 percent.
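The math behind those numbers is easy to sketch. The snippet below is a back-of-the-envelope estimate using only the figures in this review (the claimed 800-Wh capacity and the roughly 20 percent drain over a three-hour ride); it is not an official Aventon figure, and real-world draw will vary wildly with power mode, terrain, and rider weight.

```python
# Back-of-the-envelope ride-time estimate from the figures in this review.
BATTERY_WH = 800          # claimed battery capacity, watt-hours
ride_hours = 3.0          # longest single ride in the test
battery_used = 0.20       # fraction of charge consumed in that ride

# Average draw during the test ride, in watts (~53 W here).
avg_draw_w = (battery_used * BATTERY_WH) / ride_hours

# Hours of riding a full charge would give at that same draw (~15 h here).
est_ride_hours = BATTERY_WH / avg_draw_w

print(f"average draw: {avg_draw_w:.0f} W")
print(f"estimated ride time on a full charge: {est_ride_hours:.0f} hours")
```

At the mellow pace of that test ride, a full charge pencils out to around 15 hours in the saddle, which makes the 105-mile claim at least plausible in low-assist modes.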
Eyes Up
The biggest flaw I found in the Current is small and seemingly simple, but it nonetheless had a major impact on my rides: When you click through the power settings, the bike beeps, and every beep sounds the same.
When I’m mountain biking (and probably when you’re mountain biking, too), the last thing I want to do is to take my eyes off the trail. Having those beeps be the exact same tone meant I instinctively kept looking down at the top-tube-mounted display to see which mode I was in.
Medical experts I spoke with balked at the idea of uploading their own health data for an AI model, like Muse Spark, to analyze. “These chatbots now allow you to connect your own biometric data, put in your own lab information, and honestly, that makes me pretty nervous,” says Gauri Agarwal, a doctor of medicine and associate professor at the University of Miami. “I certainly wouldn’t connect my own health information to a service that I’m not fully able to control, understand where that information is being stored, or how it’s being utilized.” She recommends people stick to lower-stakes, more general interactions, like prepping questions for your doctor.
It can be tempting to rely on AI-assisted help for interpreting health, especially with the skyrocketing cost of medical treatments and overall inaccessibility of regular doctor visits for some people navigating the US health care system.
“You will be forgiven for going online and delegating what used to be a powerful, important personal relationship between a doctor and a patient—to a robot,” says Kenneth Goodman, founder of the University of Miami’s Institute for Bioethics and Health Policy. “I think running into that without due diligence is dangerous.” Before he considers using any of these tools, Goodman wants to see research proving that they are beneficial for your health, not just better at answering health questions than some competitor chatbot.
When I asked Meta AI for more information about how it would interpret my health information, if I provided any, the chatbot said it was not trying to replace my physician; the outputs were for educational purposes. “Think of me as a med school professor, not your doctor,” said Meta AI. That’s still a lofty claim.
The bot said the best way to get an interpretation of my health data was just to “dump the raw data,” like clinical lab reports, and tell it what my goals were. Meta AI would then create charts, summarize the info, and give a “referral nudge if needed.” In other chats I conducted with Meta AI, the bot prompted me to strip personal details before uploading lab results, but these caveats were not present in every test conversation.
“People have long used the internet to ask health questions,” a Meta spokesperson tells WIRED. “With Meta AI and Muse Spark, people are in control of what information to share, and our terms make clear they should only share what they’re comfortable with.”
In addition to privacy concerns, experts I spoke with expressed trepidation about how these AI tools can be sycophantic and influenced by how users ask questions. “A model might take the information that’s provided more as a given without questioning the assumptions that the patient inherently made when asking the question,” says Agarwal.
When I asked how to lose weight and nudged the bot toward extreme answers, Meta AI helped in ways that could be catastrophic for someone with anorexia. While asking about the benefits of intermittent fasting, I told Meta AI that I wanted to fast five days every week. Despite flagging that this was not advisable for most people and could put me at risk of an eating disorder, Meta AI crafted a meal plan in which I would eat only around 500 calories most days, which would leave me malnourished.
In January 2026, 45 UK MPs submitted an Early Day Motion entitled “UK digital sovereignty strategy”. The motion pointed to the dependency of government services, democratic functions and critical infrastructure on a small number of digital providers.
Those providers are the US-based hyperscaler cloud providers AWS, Azure and Google Cloud, also known as the Big Three, which between them provide cloud services to more than 90% of UK public sector organisations.
Meanwhile, in October 2025, the European People’s Party group in the European Parliament adopted a position paper calling for, “a permanent EU Tech Forum to guide digital strategy [and] build sovereign European digital infrastructure for cloud, AI and data – free from foreign control”.
This came ahead of a summit on European digital sovereignty that took place in November in Berlin and gathered more than 900 policymakers, industry leaders, investors, researchers and civil society representatives from 27 EU member states.
At the event, German chancellor Friedrich Merz said: “For Europe, digital sovereignty means the ability to shape technology across the entire value chain in line with European interests and needs. We seek competition on equal terms.”
These are just some examples of initiatives aimed at wresting back some control and data sovereignty in the UK and Europe against a backdrop of overwhelming dominance by US hyperscalers of public and private sector infrastructure.
In this article, we look at European lawmakers’ attempts to drive towards greater digital sovereignty, how that overlaps with opposition to anti-competitive practices in the market, and why governments need to think about encouraging home-grown tech – or else risk losing it.
Digital sovereignty: Taking back control
The UK digital sovereignty strategy Early Day Motion was sponsored by MPs from parties that included the Greens, Labour, Liberal Democrats, Plaid Cymru and numerous independents.
The first part of the motion read: “That this house notes that government services, democratic functions and critical infrastructure increasingly depend on a small number of external digital suppliers; further notes that excessive concentration and inadequate exit or substitution planning expose the public sector to risks including service withdrawal, sanctions, commercial failure, geopolitical disruption and unilateral changes in service terms.”
It went on to say it believed “long-term resilience, continuity of public services and value for money require the government to retain effective control over digital systems it funds or relies on” and to “support UK technology firms and SMEs, and increase the proportion of public digital expenditure retained in the UK economy”.
It capped this with a call to, “publish a comprehensive UK digital sovereignty strategy with binding effect across central government, arm’s-length bodies and the wider public sector”.
A lack of digital sovereignty? The UK public sector example
In the financial year 2023/2024, 95% of central and local public sector organisations in the UK spent budget on hyperscale cloud services. When it comes to spending on services such as software as a service (SaaS) that rely on hyperscaler cloud, that percentage expands to 99%.
This is taken from data gathered by Tussell and Computer Weekly that covers more than 1,100 central and local government organisations that range from ministries to councils and a wide variety of other agencies.
Out of 22 government departments in the data, 21 spent budget on hyperscale cloud in some form in that year, and 13 spent 50% or more of their tech budget on hyperscale cloud directly or via cloud resellers.
The top five public sector spenders on hyperscale cloud were: Ministry of Defence (£1.09bn), HM Revenue & Customs (£1.01bn), the Home Office (£775m), Department for Work and Pensions (£622m), and NHS England (£442m).
Digital sovereignty: UK government lacks a definition
Meanwhile, at ministry level – namely the Department for Science, Innovation and Technology (DSIT) – the UK lacks a clear definition of digital sovereignty from which to work.
It told Computer Weekly in a request for comment in February 2026: “This is a complex and evolving policy area, rather than a specific project. It requires engaging with departments across government – a process which is ongoing.”
The DSIT could not give a timescale for the process, but said: “Work continues across government to ensure a consistent approach, and we will have more to say in due course. There is no single, globally agreed definition of digital sovereignty. International approaches vary and are shaped by domestic policy objectives.
“However, UK public sector technology buyers already operate inside a strong framework of safeguards, for example: data protection law, UK security standards, the Cloud First policy and established commercial rules. These combine to help effectively protect public services.”
Liberal Democrat spokesperson for science, innovation and technology Tim Clement-Jones believes this lack of definition serves a purpose – namely, that the DSIT doesn’t have to grapple effectively with regulation around the issue.
“They’re very good at lacking definitions, because it means that they don’t have to regulate them. That’s the whole idea,” he says. “When we did our AI and defence paper, they didn’t have a definition of a lethal autonomous weapon. And we thought, ‘This is peculiar. These things are dangerous; there’s high risk’, but they couldn’t come up with one. And they said, ‘NATO doesn’t have a definition either’.”
Where data sovereignty meets anti-trust
Nicky Stewart, senior adviser with the Open Cloud Coalition, believes UK public sector procurement is held in a stranglehold by AWS and Microsoft, and that this is anti-competitive and to the detriment of UK companies. The cost to those organisations that procure cloud services, and by extension the UK taxpayer, is up to £500m per year, she says.
She believes UK public sector procurement has moved from a “public cloud first” policy to one of “hyperscaler cloud first” and that direct awards resulting from this have tended to lock public sector bodies into the US giants.
Stewart says: “They came up with the G-Cloud framework, where essentially cloud providers who aspired to provide to government could showcase their wares. It operated as a catalogue. The buyer went in with a list of their requirements and it would spit out a list of providers and their services. They put that down to a short list and then they directly awarded it. There was no competitive process, no negotiation around prices, nothing.”
Initially, she says, that involved relatively small direct award contracts: “But when they started moving to hyperscale public cloud, the size of those direct awards got bigger and bigger. Some of those contracts were hundreds of millions in direct award even though the Crown Commercial Services’ own guidance says they should be for low value or urgent transactions.”
Then, says Stewart, came “committed spend” agreements – such as with AWS for multiple millions of pounds – and into which government departments became even more tightly locked. Meanwhile, she says, UK suppliers are shut out by high entry requirements to frameworks such as G-Cloud.
“The public sector has got itself locked into the two dominant cloud providers,” says Stewart. “And once you’re locked in, there’s a whole chain of things you need to think about. It’s not just a case of ‘I want to switch cloud providers’ or ‘I want to diversify my cloud providers’. You need to think about the skills to switch or diversify and the uncertainty about how much it will cost.”
The Competition and Markets Authority (CMA) is set to decide whether to apply strategic market status (SMS) in relation to AWS and Microsoft’s activities in cloud services. SMS would allow the CMA to “impose targeted and bespoke interventions to address … concerns … identified”.
It remains to be seen what the effect of those measures will be.
European responses to risks around data sovereignty
Europe has been somewhat further ahead in formulating responses to concerns over data sovereignty, in particular with regard to the overwhelming market dominance of the US hyperscalers, and there have been initiatives to build some degree of home-grown cloud tech. Europe is also a little less dependent on US hyperscalers than the UK, so it’s possible those initiatives have made a dent.
Initiatives include:
The European Gaia-X project to develop a secure European data infrastructure, although this appears largely stalled.
France’s SecNumCloud, a high-level security certification for cloud service providers aimed at provision of trusted, sovereign hosting by protecting against non-EU legal, technical and cyber security risks.
France’s Cloud de confiance, a government-backed initiative to provide secure, sovereign cloud computing services that protect sensitive data from foreign surveillance.
The industrial-focussed IPCEI-CIS, in which around 100 companies and institutes from 12 EU countries are cooperating on developing new data and cloud solutions.
What campaigners call for: Axel’s axis in Europe
Axel Voss MEP of the European People’s Party has been a vocal advocate of building European digital sovereignty. He wants to cut red tape and create a preferential environment for European suppliers. Voss believes European sovereign digital capability means strengthening European suppliers and making it easier for European public and private sector organisations to use them.
He says: “It’s not autarky or protectionism, it’s Europe being able to take independent decisions about the parameters of digital technologies, backed by real European options in cloud, AI and data; open standards and interoperability; and procurement that builds a resilient European supplier base.
“Practically, that means pilots that combine European compute and data spaces, ‘EU-by-default’ tools in institutions, and funding and scale mechanisms to make European providers competitive.”
For Voss, a key matter is also to remove obstacles to European digital innovation: “Our main obstacles are fragmentation and slow, bureaucratic decision-making. That’s why I push measures like cutting real red tape, strengthening investment/VC and strategic capabilities (cloud/AI/edge/cyber/chips), and using procurement and open standards to break lock-ins.”
Grow native capability or die?
Nicky Stewart of the Open Cloud Coalition wants to lower barriers to UK cloud providers, after years of them being sidelined while UK public sector procurement resulted in the hyperscalers becoming entrenched.
“There are more UK cloud providers than I can count on my hands and feet,” she says. “Some of them can operate at scale – not necessarily the same scale as the hyperscale cloud providers, but they have different offerings. There’s always going to be a place for hyperscale and there are certain workloads that are suited to that sort of scale.
“But there are other workloads with different requirements. Maybe they’re more stable, for example, not peaking and spiking. Or they may have really high security requirements, or sovereign solutions, or can offer better value for money, or much more personal customer service.
“The point here is that if the UK public sector government doesn’t give the right signals to its own cloud hosting industry, how on earth does it expect to grow any native capability?”
OpenAI has paused plans for its Stargate UK investment, which was to take place in concert with artificial intelligence (AI) datacentre builder Nscale and in the government’s AI growth zones.
The Microsoft-backed company has cited concerns about rising energy costs as well as the regulatory environment in the UK, particularly around copyright.
Affected locations – should OpenAI’s “pause” become permanent – are in the government’s north eastern AI growth zone centred on North Tyneside and Blyth in Northumberland.
According to an Nscale announcement in September 2025, Nscale, OpenAI and Nvidia agreed to establish Stargate UK as an infrastructure platform designed to deploy OpenAI’s technology in the UK.
It said at the time that OpenAI would “explore offtake of up to 8,000 Nvidia GPUs [graphics processing units] in Q1 2026 with the potential to scale to 31,000 Nvidia GPUs over time”.
It said Stargate UK would be based across a number of sites in the UK, but only named Cobalt Park, which is currently home to about 35MW of datacentre capacity.
Expansion of Cobalt Park has been touted, but most of this appears to centre on the now-shelved OpenAI/Nscale plans, and there are currently no planning applications lodged or construction underway for datacentre capacity at the site.
Calculated pause?
That much of OpenAI’s plans have been hedged with conditional wording and a lack of concrete progress is not lost on some industry watchers. Bill McCluggage – director of IT strategy and policy in the Cabinet Office and deputy government CIO from 2009 to 2012 – said OpenAI’s decision to pause its proposed Stargate datacentre in the north east looks less like a sudden setback and more like a calculated pause.
“The stated concerns about uncertainty around UK copyright rules and high energy costs are real enough, particularly given the government’s fickle approach to copyright regulation and how power-hungry these facilities are,” he said. “But they are unlikely to be the whole story.
“With an IPO on the horizon, it is hardly surprising that OpenAI is tightening its risk profile, especially against a backdrop of rising infrastructure costs, supply chain fragility in advanced chips, and questions about the pace of AI commercial returns. Reports of delays and disagreements in similar US projects only reinforce that caution.”
McCluggage also suggested the move may be a means to apply pressure for clearer government support and policy certainty.
“In that light, the pause feels less like retreat and more like prudent positioning before committing to a multibillion-pound bet,” he said.
OpenAI has also cited concerns about “regulation”, in particular the UK government stance on copyright with regard to AI training. Here, the government had originally been set to allow AI training to be exempt from copyright, but then faced a backlash from creative sectors fronted by Elton John and Dua Lipa. In late March, the government adopted a holding position that barred open access to copyrighted works for AI training.
Liberal Democrat peer Lord Clement-Jones said: “This is disappointing news, but citing regulation as a reason for not proceeding with their investment in the UK is laughable given the European regulatory landscape and similar copyright issues. Energy costs and other wider economic risks may well have deterred OpenAI alongside potentially overstretched global investment plans.”
Call for clarity
Conservative peer Chris Holmes called for clarity around the issue, and the need for a UK AI Bill.
“What we all need when it comes to AI is clarity, consistency and a coherent approach,” he said. “From the government right now, this is not quite the case. By yet again ‘ducking’ the copyright issue last month they leave everyone in limbo, with a sub-optimal non-solution for all concerned.
“If the government really wants us to optimise the AI opportunity, they must bring forward a cross sector, cross economy AI Bill that brings clarity, consistency and coherence of approach which will benefit datacentre build, startups and scaleups, and a real sense of UK sovereign AI,” said Holmes. “Sadly, it seems in the upcoming King’s Speech on 13 May, they have no intention of taking this clear positive action.”
OpenAI and the UK government signed a memorandum of understanding in July 2025 aimed at strategic partnership to deliver AI-driven growth.
At the time, OpenAI cited its use by big UK names that included the NHS, NatWest, Oxford University and Virgin Atlantic.
OpenAI was careful to label commitments as “non-binding”, but these included exploring use of AI in the public sector, developing UK sovereign AI capability and security research.
At the same time, OpenAI said it would increase its footprint in the UK from its current 100 staff.