Privacy will be under unprecedented attack in 2026 | Computer Weekly

The privacy of electronic communications will face new risks in 2026, as the UK and other governments push for greater capabilities to harvest and analyse more data on private citizens, and to make it harder to protect communications with end-to-end encryption.

Over the next 12 months, we can expect more pressure from the UK and Europe to restrict the unencumbered use of end-to-end encrypted email and messaging services such as Signal, WhatsApp and many others.

In the 1990s, the US government tried and ultimately failed to persuade telecommunications companies to install a device known as the Clipper chip to provide the US National Security Agency (NSA) with “backdoor” access to voice and data communications.

The crypto wars of 2026 are more subtle: governments, law enforcement agencies and intelligence services are pushing controls and restrictions on encryption as a means of detecting child sexual abuse and terrorist material shared through encrypted email and messaging systems.

The answer governments are settling on is to encourage the use of scanning technology, whether voluntary or compulsory, to identify problematic content before it is encrypted.

Cryptographers and computer scientists have repeatedly warned that such plans will create security vulnerabilities that will leave the public less safe than before.
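In outline, what is being proposed is client-side scanning: content is checked against a list of known-bad fingerprints on the user's own device, before encryption happens. The sketch below uses exact cryptographic hashes for simplicity (deployed proposals favour perceptual hashes, and the blocklist, function names and payloads here are invented for illustration):

```python
import hashlib

# Hypothetical blocklist of known-bad content, held as SHA-256 digests.
BLOCKED_HASHES = {
    hashlib.sha256(b"known illegal payload").hexdigest(),
}

def client_side_scan(message: bytes) -> bool:
    """Return True if the message matches a known-bad fingerprint.

    Note that this runs on plaintext, before any encryption -- which is
    why critics argue such schemes undermine end-to-end guarantees.
    """
    return hashlib.sha256(message).hexdigest() in BLOCKED_HASHES

def send(message: bytes, encrypt, transmit):
    """Scan, then encrypt, then hand off to the transport layer."""
    if client_side_scan(message):
        raise ValueError("content matched scanning list; message blocked")
    transmit(encrypt(message))
```

Even in this toy version, the objection is visible: the check necessarily sees plaintext, so whoever controls the blocklist decides what every supposedly end-to-end encrypted message is tested against.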

Chat Control and client-side scanning

The European Parliament and Council are expected to adopt the controversial Child Sexual Abuse Regulation (CSAR) in spring 2026. In its current form, it proposes that messaging platforms voluntarily scan private communications for offending content, combined with proposals for age verification of users.

Known by the nickname Chat Control, its critics – such as former MEP Patrick Breyer, a jurist and digital rights activist – claim the regulation will open the doors to “warrantless and error-prone” mass surveillance of European Union (EU) citizens by US technology companies. The algorithms, say critics, are notoriously unreliable, potentially exposing tens of thousands of legal private chats to police scrutiny.

Chat Control will also put pressure on technology companies to introduce age checks to help them “reliably identify minors”, a move that would likely require every citizen to upload an ID or take a face scan to open an account on an email or messaging service. According to Breyer, this creates a de facto ban on anonymous communication, putting whistleblowers, journalists and political activists who rely on anonymity at risk.

Online Safety Act

In the UK, there remain concerns about provisions in the Online Safety Act that, if implemented by regulator Ofcom, would require technology companies to scan encrypted messages and emails.

These powers attracted widespread criticism from technology companies as the bill passed into law, with Signal warning it would pull its encrypted messaging service from the UK if it was forced to introduce what it called a “backdoor”.

Commentators think there is little current appetite for Ofcom to mandate client-side scanning for private communications, given the level of opposition.

But it may require providers of public and semi-public services, such as cloud storage, to introduce scanning services to detect illegal content.

“I think they may be waiting to see what happens in Europe with the Chat Control proposal, because it’s quite hard for the UK to go alone,” James Baker, campaigner at the Open Rights Group, told Computer Weekly.

Perceptual hash matching

One of the items on Ofcom’s agenda is a form of scanning, known as perceptual hash matching, which uses an algorithm to decide whether images or videos are similar to known child abuse or terrorism images.

A consultation document from Ofcom proposes requiring tech platforms that allow users to upload or share photographs, images and videos – including file storage and sharing services, and social media companies – to introduce the technology for detecting terrorism and abuse-related material.

“We also think some services should go further – assessing the role that automated tools can play in detecting a wider range of content, including child abuse material, fraudulent content, and content promoting suicide and self-harm, and implementing new technology where it is available and effective,” it says in its consultation document.

But there are questions about the accuracy of perceptual hash matching, and the risk that its use may lead to people wrongly being barred from online services for alleged crimes they have not committed.

Critics point out that perceptual hash matching used to be called “fuzzy matching” – and for good reason. Although its new name, “perceptual hash matching”, gives the impression of precision and predictability, in reality, it produces false positives and negatives.
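The idea is easy to sketch. The "average hash" below is one of the simplest perceptual hashing algorithms (production systems are far more sophisticated; this toy version is purely illustrative): each image is reduced to a 64-bit fingerprint, and two images "match" when their fingerprints differ in only a few bits. That threshold is precisely where the fuzziness, and the false positives and negatives, come from.

```python
def average_hash(pixels):
    """Compute a 64-bit average hash from an 8x8 grayscale grid.

    Each bit is 1 if the pixel is brighter than the grid's mean.
    Real pipelines first downscale and filter the image; this sketch
    assumes the 8x8 grid has already been prepared.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a, b):
    """Number of bit positions where the two hashes differ."""
    return bin(a ^ b).count("1")

def is_match(hash_a, hash_b, threshold=10):
    # A small distance means "perceptually similar". Tuning the
    # threshold trades false negatives against false positives.
    return hamming_distance(hash_a, hash_b) <= threshold
```

A lightly edited image keeps almost the same fingerprint while a genuinely different one does not, but nothing guarantees that an innocent image will never land within the threshold of a blocklisted one.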

Hundreds of people have been blocked from Instagram, owned by Meta, after being wrongly accused of breaching Meta’s policies on child sexual exploitation and abuse. The company’s actions took a huge emotional toll on the people affected, and in some cases led to people losing their online businesses, the BBC reported in October 2025.

Alec Muffett, security expert and former Facebook engineer, told Computer Weekly that Ofcom’s proposals display “a horrifying lack of safety by design” and said its proposal to force companies to adopt the technology without mitigating the potential risks is “derelict”.

“Perceptual hashing is just a fancy name for what we used to call ‘fuzzy matching’ with ‘digital fingerprints’, and even if we ignore the problem of false positives, we are left with the risk of creating an enormous cloud surveillance engine by logging all queries for even benign digital fingerprints,” he said.

Encryption apps viewed as national security risk

There are signs of increasing government discomfort with encrypted communications. In December 2025, the Independent Reviewer of State Threats Legislation delivered a stark warning that developers of encryption technology could be subject to police stops, detention and questioning, and the seizure of their electronic devices under national security laws.

According to Jonathan Hall KC, the developer of an app whose selling point is that it offers end-to-end encryption could be considered to be unwittingly engaged in “hostile activity” under Section 3 of the Counter-Terrorism and Border Security Act 2019.

“It is a reasonable assumption that [the development of the app] would be in the interests of a foreign state even if the foreign state has never contemplated this potential advantage,” he wrote.

Digital ID all over again

The UK’s proposals for a mandatory digital ID scheme look set to be another battleground for privacy in 2026. The government says the scheme will help to crack down on illegal immigration by introducing mandatory “right to work” checks by the end of the Parliamentary term.

MPs were scathing when the bill was introduced in Parliament. “The real fear here is that we will be building an infrastructure that can follow us, link our most sensitive information and expand state control over all our lives,” said Rebecca Long-Bailey during the debate. Others raised concerns about the cyber security risks of storing details of the population on a central government database.

Gus Hosein, executive director of campaign group Privacy International, notes that the Home Office is repeating the same arguments put forward in 2003, when Tony Blair’s government attempted to introduce a national identity card. That scheme was scrapped by the Conservative and Liberal Democrat coalition in 2010. “It’s just the same boring rhetoric: ‘It’s going to stop ID fraud, it’s going to stop terrorism, it’s going to stop migration problems,’” he said. “Do we really have to go through the whole process of debunking this again?”

Hosein said the prospects of the Home Office coming up with a workable system before the next election are low. The political climate is different this time. Nearly three million people have signed a Parliamentary petition calling for the idea to be scrapped. “If they try and do the classic thing, which is to try and build something grand and momentous, it will take forever,” he said. “I would not mind an ID system that actually worked, I just don’t want the Home Office within 10,000 miles of it.”

When combined with facial recognition, digital ID raises further privacy issues. Campaign groups are expected to bring a legal challenge in 2026 after Freedom of Information Act requests revealed that the government covertly allowed police forces to search 150 million UK passport and immigration database photos for matches of images captured by facial recognition technology.

Big Brother Watch and Privacy International have issued legal letters before action to the Home Office and the Metropolitan Police. They argue that there is no clear legal basis for the practice and that the Home Office has kept the public and Parliament in the dark.

“There is a risk when you roll out digital facial recognition cameras that the images used for digital ID will be used to track you around town centres,” said the Open Rights Group’s Baker.

Apple backdoors and technical capability notices

This year will see further legal challenges at the Investigatory Powers Tribunal against the Home Office’s secret order issued against Apple, requiring it to facilitate access for law enforcement and intelligence agencies to encrypted data stored by Apple’s customers on iCloud.

Scheduled for the spring, the case brought by Privacy International and Liberty will challenge the lawfulness of the Home Office using a technical capability notice (TCN) to require Apple to disclose the encrypted data of users of its Advanced Data Protection (ADP) service worldwide.

Apple is expected to issue a new legal challenge of its own. The UK government abandoned its original wide-ranging TCN and replaced it with an order focused on access to ADP users in the UK only, a move that brought Apple’s first legal challenge to an end, at least for now.

The case has the potential to turn into a mammoth battle, reaching the Supreme Court and the European Court of Human Rights.

Surveillance of journalists

This year will also see further legal challenges that will test the boundaries between state intrusion and the professional privileges accorded to lawyers and journalists to protect the confidentiality of their clients or journalistic information.

The Investigatory Powers Tribunal is due to decide on a case brought by the BBC and former BBC journalist Vincent Kearney against the Police Service of Northern Ireland and the Security Service, MI5.

The Security Service broke with the convention of Neither Confirm Nor Deny (NCND) to acknowledge to the tribunal that it had unlawfully obtained phone communications data from Kearney in 2006 and 2009, while he was working at the BBC, in an attempt to identify his confidential sources.

Although MI5 followed the Communications Data code of practice at the time, the code did not meet the strict legal tests for accessing journalistic material, which is protected under the European Convention on Human Rights.

In a judgment just before Christmas, the IPT rejected arguments that MI5 should disclose further details of surveillance operations against Kearney and other BBC journalists, including operations that had proper legal approval. The IPT will decide what remedy is due in 2026, and whether Kearney and the BBC should receive compensation.

Another legal case will test the boundaries between police surveillance and the legal protection given to lawyers to protect the confidentiality of discussions with their clients when subject to police stops.

Fahad Ansari, a lawyer who acted for Hamas in an attempt to overturn its proscription as a terrorist organisation in the UK, had his mobile phone seized by police after he was detained under Schedule 7 of the Terrorism Act 2000 at a ferry port, after returning from a family holiday.

The case is believed to be the first targeted use of Schedule 7 powers – which allow police to stop and question people and seize their electronic devices without the need for suspicion – against a practising solicitor.

Ansari is seeking a judicial review to challenge the right of police to examine the contents of his phone, which contains confidential and legally privileged material from his clients, accumulated over 15 years.

The legal fallout from EncroChat and SkyECC

The legal fallout from an international police operation to hack the encrypted phone networks Sky ECC and EncroChat more than five years ago will continue.

French police led operations to harvest tens of millions of encrypted messages used as evidence of criminality to bring prosecutions against drug gangs across Europe and the UK.

Defence lawyers and forensic experts have raised questions about the reliability of the evidence supplied by the French to the UK and EU states through Europol.

France has declared the hacking operation against EncroChat and Sky ECC a state secret and refused to allow members of the French Gendarmerie to give evidence on how the intercepted data was obtained.

This has meant individuals facing charges outside France based on evidence from EncroChat or Sky ECC have no legal recourse to challenge the legality of the French hacking operation.

Courts in the EU are obliged to accept the evidence provided by France under the “mutual recognition” principle that applies when one EU state supplies evidence to another under a European Investigation Order.

At the same time, people have been denied the right to challenge the evidence against them in the French courts, leaving people charged with offences based on the hacked phone data without legal recourse to appeal in any jurisdiction.

Decisions by the European Court of Justice and the European Court of Human Rights, expected this year, could end that anomaly.

In one case, the French Supreme Court – La Cour de cassation – has asked the Court of Justice to decide whether France’s refusal to allow non-French citizens to challenge the lawfulness of the French hacking operations in France contravenes EU law. According to La Cour de cassation, the decision is likely to have “significant consequences” for legal proceedings based on intercepted evidence in the EU.

In the second case, the European Court of Human Rights is expected to decide on a complaint from a German citizen, Murat Silgar, who was jailed for drug offences on the basis of EncroChat evidence.

Silgar argues that the German courts used illegally obtained communications data and that technical details of the French retrieval of EncroChat data were not shared with him, in breach of the European Convention on Human Rights, which protects the right to a fair trial and the right to private correspondence.

Justus Reisinger, a member of a coalition of defence lawyers known as the Joint Defence Team, told Computer Weekly the cases would address “a fundamental principle” in cross-border and digital investigations. “The law of the European Union requires that people have an effective remedy,” he said.

These are just a few of the battle lines between technology and privacy that will play out in 2026. For governments, the promise of a “technical fix” to deal with wider societal problems, such as child abuse and terrorism offences, is attractive. But history has shown that “technical fixes” rarely work, and often have unforeseen consequences.



This AI Agent Is Designed to Not Go Rogue

AI agents like OpenClaw have recently exploded in popularity precisely because they can take the reins of your digital life. Whether you want a personalized morning news digest, a proxy that can fight with your cable company’s customer service, or a to-do list auditor that will do some tasks for you and prod you to resolve the rest, agentic assistants are built to access your digital accounts and carry out your commands. This is helpful—but has also caused a lot of chaos. The bots are out there mass-deleting emails they’ve been instructed to preserve, writing hit pieces over perceived snubs, and launching phishing attacks against their owners.

Watching the pandemonium unfold in recent weeks, longtime security engineer and researcher Niels Provos decided to try something new. Today he is launching an open source, secure AI assistant called IronCurtain designed to add a critical layer of control. Instead of the agent directly interacting with the user’s systems and accounts, it runs in an isolated virtual machine. And its ability to take any action is mediated by a policy—you could even think of it as a constitution—that the owner writes to govern the system. Crucially, IronCurtain is also designed to receive these overarching policies in plain English and then run them through a multistep process that uses a large language model (LLM) to convert the natural language into an enforceable security policy.

“Services like OpenClaw are at peak hype right now, but my hope is that there’s an opportunity to say, ‘Well, this is probably not how we want to do it,’” Provos says. “Instead, let’s develop something that still gives you very high utility, but is not going to go into these completely uncharted, sometimes destructive, paths.”

IronCurtain’s ability to take intuitive, straightforward statements and turn them into enforceable, deterministic—or predictable—red lines is vital, Provos says, because LLMs are famously “stochastic” and probabilistic. In other words, they don’t necessarily always generate the same content or give the same information in response to the same prompt. This creates challenges for AI guardrails, because AI systems can evolve over time such that they revise how they interpret a control or constraint mechanism, which can result in rogue activity.

An IronCurtain policy, Provos says, could be as simple as: “The agent may read all my email. It may send email to people in my contacts without asking. For anyone else, ask me first. Never delete anything permanently.”

IronCurtain takes these instructions, turns them into an enforceable policy, and then mediates between the assistant agent in the virtual machine and what’s known as the model context protocol server, which gives LLMs access to data and other digital services to carry out tasks. Being able to constrain an agent this way adds an important layer of access control that web platforms like email providers don’t currently offer, because they weren’t built for a scenario in which a human owner and AI agents share a single account.
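In spirit, that mediation layer can be sketched as a deterministic rule table sitting between the agent and its tools. Everything below (the action names, the policy format, the `mediate` function) is invented for illustration; IronCurtain's actual policy language and enforcement mechanism will differ. It encodes the example policy above: read freely, send only to known contacts, never delete permanently, and escalate anything unrecognised to the human.

```python
# Hypothetical contact list and compiled policy for one user.
CONTACTS = {"alice@example.com", "bob@example.com"}

POLICY = {
    "read_email": "allow",
    "send_email": "conditional",   # allowed only to known contacts
    "delete_permanently": "deny",  # "never delete anything permanently"
}

def mediate(action, target=None):
    """Return 'allow', 'deny', or 'ask_user' for a proposed agent action.

    The ruling is deterministic: the same action always gets the same
    answer, unlike asking an LLM to re-interpret the policy each time.
    """
    verdict = POLICY.get(action, "ask_user")  # unknown actions escalate
    if action == "send_email" and verdict == "conditional":
        return "allow" if target in CONTACTS else "ask_user"
    return verdict
```

The point of compiling the natural-language policy into a table like this once, rather than consulting the model on every action, is that the guardrail no longer drifts with the model's output.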

Provos notes that IronCurtain is designed to refine and improve each user’s “constitution” over time as the system encounters edge cases and asks for human input about how to proceed. The system, which is model-independent and can be used with any LLM, is also designed to maintain an audit log of all policy decisions over time.

IronCurtain is a research prototype, not a consumer product, and Provos hopes that people will contribute to the project to explore and help it evolve. Dino Dai Zovi, a well-known cybersecurity researcher who has been experimenting with early versions of IronCurtain, says that the conceptual approach the project takes aligns with his own intuition about how agentic AI needs to be constrained.



OpenAI Announces Major Expansion of London Office

OpenAI has announced plans to turn its London office into its largest research hub outside of the United States.

The company—which established a UK office in 2023—says it will expand its London-based research team, scooping up talent emerging from leading British universities. It has not indicated how many researchers it will hire.

“The UK brings together world-class talent and leading scientific institutions and universities, making it an ideal place to deliver the important research which will ensure our AI is safe, useful, and benefits everyone,” said Mark Chen, chief research officer at OpenAI, in a statement.

The plans bring OpenAI into direct competition for top research talent with Google DeepMind, the AI lab run by British researcher Demis Hassabis, which is headquartered in London. DeepMind has long-running partnerships with Oxford University and the University of Cambridge, where it sponsors professorships, funds research, and works alongside researchers.

At the latest careers fair at Oxford University, the floor was packed with undergraduates looking for technical roles and recruiters hiring for AI-related positions. “The demand and supply is increasing on both sides, even within a year,” says Jonathan Black, director of the careers service at Oxford University. “To have something like this turn up is a really positive sign.”

OpenAI’s expansion in London could have a sort-of flywheel effect, whereby the researchers it hires early in their careers go on to start new labs in the UK, says Tom Wilson, partner at venture capital firm Seedcamp. “We’ve seen many examples over the years,” he says. “That’s where these kinds of announcements can have even more impact than the initial hires … the second-order effects can be great.”

OpenAI’s team in London will continue to contribute to products like Codex and GPT-5.2, the company says, but will now “own” certain aspects of model development relating to safety, reliability, and performance evaluation.

In a statement, the UK’s science and technology secretary, Liz Kendall, described the announcement as “a huge vote of confidence in the UK’s world-leading position at the cutting edge of AI research.”

The announcement coincides with a push in the UK to scale the nation’s data center and power infrastructure to meet the voracious demand for compute among AI companies, including OpenAI.



Stay Warm in the Lodge or Half-Pipe with the Best Ski Clothes

Honorable Mentions

During the winter, a whole WIRED crew tests ski clothes almost constantly. Here are a few other items that we like.


Hestra Fall Line 3-Finger Gloves for $190: I’ve long admired Hestra gloves from across the lift line, impressed by the Swedish company’s elegant stitchwork and thoughtful design touches. This was the year I finally got to try a pair for myself, and the Fall Line are exactly what they look like. There are six sizes available so you can get the perfect fit in this glove. The cowhide is buttery smooth and has already broken in a bit with five days’ use. The wrist strap means you never have to fret about dropping your glove from the lift when checking your phone, and they’re very warm without making me sweat. If you do sweat, the lining is removable so you can wash it without damaging the leather. —Martin Cizmar

Crab Grab Snuggler Mitts for $89: These mini sleeping bags for your fingers are packed with Primaloft insulation and lined with sherpa fleece, making them toasty warm, and with a 15K membrane, impressively waterproof too. All-season mittens with durable construction for under $100? Yes, please!

Mons Royale Yotei Merino Classic Long Sleeve for $98: As I type this, I’m nowhere near a mountain, but I’m still wearing the Mons Royale Yotei long sleeve top. It is ridiculously comfortable, made from 190-gsm, 100 percent merino wool, and has a mercifully relaxed cut, so I remain warm but don’t feel like a sausage. On the mountain, however, the merino wool works its magic, wicking away sweat—especially on a hike up to some fresh powder—and keeping me comfortable. Paired with a shell and the Patagonia R1 Thermal Hoodie, I’m warm enough during a bitter arctic blast.

Seniq Powder Puff Down Jacket for $498 and Bib for $398: Seniq is another all-women’s outdoor brand that launched in 2024. It’s styled a little more Gen Z, leaning into fun color blocking over the monochromatic look. The Seniq Powder Puff Down Jacket has a dry-touch finish. It’s meant for drier days on the mountain, but a PFC-free DWR coating and YKK AquaGuard zippers do provide water resistance. The asymmetric front zipper helps you avoid chin rub when you have the jacket fully zipped. It also features cool asymmetrical quilting lines, side pockets-in-pockets that provide access to your bib (their bibs have a pocket on the front, so you can get in there without unzipping your jacket), an oversized removable hood, a forearm pass pocket, soft and stretchy wrist gaiters, and a large internal pocket that can absolutely handle a sandwich. This jacket was warm, pillowy, and comforting, like a super-soft hug.

The silky shell bibs are slightly barrel cut, which gives them a flattering shape without being fitted. The adjustable racer back-style straps and low back (with a stretchy waist) also provide a nice shape and breathability. There are two pockets on the front chest, pockets on either leg, two-way zip thigh vents, and a butt zipper for bathroom breaks. These fit easily over my boots, and the instep guards were a nice touch. With a durable three-layer membrane and a 20,000-mm waterproof rating, these will hold up against any and all weather the mountain throws at you. When I wore them on a wet snowy day, they beaded and sloughed off moisture well. —Kristin Canning

Mammut Sender In Hooded Jacket for $259: This puffy hoodie is a great mid-layer for under a shell jacket. The insulation is made from recycled rope scraps, and the outer is coated in wind-resistant PFC-free DWR coating. The hem falls at the hips, and the high collar and tight hood keep most of the face covered. I like wearing this piece under shells for snowboarding, but I know it’ll pull double duty as a comfy hiking and camping jacket, too, so it’s a solid multipurpose investment. It’s exceptionally lightweight and warm, though from a volume standpoint, it is on the bulkier side for a mid-layer and isn’t the most packable piece. —Kristin Canning

Helly Hansen Evolved Air Half Zip for $112: This fleece pullover has a waffle-like texture that traps heat and wicks moisture. With a high zippered collar and cinchable hem, you can adjust the fit to make it more airtight or breathable. This mid-layer felt wonderfully lightweight while still keeping me toasty. It’s not bulky at all, only a little thicker than a base layer, and it laid comfortably under my jackets and moved with me on the mountain. —Kristin Canning

Helly Hansen Lifa Base Layer Long-Sleeve Crew for $115 and Pants for $115: These base layers hit the weight sweet spot; they’re not too thick or thin, but just right. They’re slightly looser than other options on this list, so if you prefer something that isn’t so fitted, these are a great pick (but note that they run long too). These combine merino wool with Helly Hansen’s LIFA fibers, which add more moisture-wicking capabilities. They’re soft, lightweight, warm, and don’t hold onto smells. I love the cute designs and how well they regulate my temperature under insulated jackets and pants. The waist digs in a bit but doesn’t roll, and they stay in place and move well. —Kristin Canning

We have a full guide on how to layer, but here are your essentials.

Base layer: A good set of thermals is essential in the fight against cold, especially when you’re working hard. The best fabrics wick away sweat as you heat up, which helps regulate your temperature. Merino wool is the best at this, but also the most expensive. Synthetic fabrics are getting better, though, and please avoid cotton at all costs, as it gets wet and stays that way, making you cold and uncomfortable.

Mid layer: Whether you choose a hooded fleece or puffer-style jacket, this layer does the bulk of the work in cold conditions. Combined with the base layer, it traps warm air in, while also allowing moisture to be expelled. Synthetic insulation such as Primaloft Gold is brilliant and doesn’t lose its properties if it gets wet. Down jackets offer the best warmth-to-weight ratio, but they don’t pack down as small, and should never get wet. A fleece with an insulated vest is a great option if you really feel the cold.

Jacket: While ski jackets with insulation offer bonus warmth in Arctic-like conditions, for most people a waterproof shell will be enough, as it offers protection from both the snow and the wind. A cold wind will chill you to your bones faster than a bit of wet snow. Ideally, choose a jacket with a waterproof membrane such as Gore-Tex (make sure it is free from PFAS, or forever chemicals), but also check for taped seams for added waterproofing, plus plenty of pockets for snacks and lift passes, and wrist cuffs and powder skirts to help keep out the snow.

Socks: As with your base layer, socks keep you warm and maintain your temperature when you’re building up a sweat. Natural fabrics work well, but a blend of merino wool with synthetic stretchy fibers is the way to go, as they stay up better and can be used for more than a day. Avoid cotton again, and never wear two pairs, as you’ll almost certainly get colder feet.

Gloves: You’ll be surprised by how wet ski gloves get when it’s snowing, even if you don’t fall very often. As a result, waterproof options work best in most cases, although well-made leather designs can be almost as waterproof as a pair with Gore-Tex. Mittens are generally warmer than gloves, but what you gain in toasty fingers you lose in dexterity. Check out our Best Ski Gloves and Mittens guide for more information.

Waterproofing and breathability ratings: Waterproofing is measured with a hydrostatic head rating, or HH. A 20,000-mm rating means that if you stood a 1-inch-square, endlessly tall tube on top of the fabric, you could pour in 20,000 millimeters of water before it would seep through. Breathability is rated by how many grams of vapor per square meter can pass through the fabric in 24 hours.

I’ve been reviewing winter sports gear for more than 15 years. In that time, I have worn an untold number of jackets, pants, mid-layers, thermals, gloves, and mittens. I called on industry experts and professional skiers, and solicited opinions from fellow winter sport enthusiasts on the WIRED team. While a basic fit check can be done in the office, nothing replaces on-mountain testing in variable conditions. We put in the time on various trips to the French Alps, as well as in resorts in Vermont, Colorado, Arizona, and Oregon.
