Tech
Essex Police halts live facial recognition over bias and accuracy risks | Computer Weekly
Essex Police has paused its use of live facial recognition (LFR) technology after identifying potential accuracy and bias risks.
The force’s suspension of its LFR system – provided by Israeli biometrics firm Corsight – was revealed in an audit document published by the Information Commissioner’s Office (ICO), which said Essex Police must work to “reduce the risks” identified before continuing with future deployments.
A list of LFR deployments from Essex Police shows the last time the force used the technology was on 26 August 2025, meaning its deployments had already been paused by the time the ICO carried out its audit that November.
While it is currently unclear what specifically prompted the force to suspend its LFR use, Computer Weekly exclusively reported in May 2025 that Essex Police had failed to properly consider its potentially discriminatory impacts, after a “clearly inadequate” equality impact assessment (EIA) was obtained via Freedom of Information rules by privacy campaign group Big Brother Watch.
Experts criticised the document at the time for being “incoherent”, failing to look at the systemic equalities impacts of the technology, and relying exclusively on testing of entirely different software algorithms used by other police forces trained on different populations.
The force was also criticised for “parroting misleading claims” from the supplier about the LFR system’s lack of bias. The US National Institute of Standards and Technology (NIST) – widely recognised as the gold standard for facial recognition testing because all of its testing data is publicly shared – holds no information to support the accuracy figures cited by Corsight, or the firm’s claim to have essentially the least-biased algorithm available.
Big Brother Watch alleged at the time that these issues taken together meant the force had likely failed to fulfil its public sector equality duty to consider how its policies and practices could be discriminatory.
Independent testing
Responding to the criticisms, the force said at the time that it was continuing to carry out evaluations, noting that both the National Physical Laboratory (NPL) and Cambridge University had been commissioned to conduct further independent testing of its system.
According to the results of that Cambridge study – published on 12 March 2026 – the system was more likely to correctly identify men than women, and was “statistically significantly more likely to correctly identify black participants than participants from other ethnic groups”.
Matt Bland, a criminologist involved in the study, said: “If you’re an offender passing facial recognition cameras which are set up as they have been in Essex, the chances of being identified as being on a police watchlist are greater if you’re black. To me, that warrants further investigation.”
By contrast, the further NPL testing – also published in March 2026 – found black men were most likely to be correctly matched by the system and white men least likely, but noted that the disparity was not statistically significant.
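The distinction the two studies turned on, whether a disparity in match rates is “statistically significant”, can be made concrete with a standard two-proportion z-test. The sketch below is purely illustrative: the figures are invented, not taken from the NPL or Cambridge studies, and the actual studies may have used different methods.

```python
from math import sqrt, erf

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided two-proportion z-test for a difference in correct-match rates."""
    p_a, p_b = success_a / n_a, success_b / n_b
    # Pooled rate under the null hypothesis that both groups match equally often
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical example: group A matched 90/100 times, group B 80/100
z, p = two_proportion_z(90, 100, 80, 100)
print(f"z = {z:.2f}, p = {p:.3f}")
```

A small p-value (conventionally below 0.05) means the observed disparity would be unlikely under equal true match rates; a larger one, as in the NPL finding, means the gap could plausibly be sampling noise.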
Computer Weekly contacted the force to ask what specifically prompted the LFR suspension decision, including whether it was the study results or previous criticisms of the EIA.
“In line with our commitment to our Public Sector Equality Duty, Essex Police commissioned two independent studies which were completed by academia,” a spokesperson said. “The first of these indicated there was a potential bias in the positive identification rate, while the second suggested there was no statistically relevant bias in the results.
“Based on the fact there was potential bias, the force decided to pause deployments while we worked with the algorithm software provider to review the results and seek to update the software,” they added. “We then sought further academic assessment.
“As a result of this work, we have revised our policies and procedures and are now confident that we can start deploying this important technology as part of policing operations to trace and arrest wanted criminals. We will continue to monitor all results to ensure there is no risk of bias against any one section of the community.”
Responding to news of the suspension, Jake Hurfurt, the head of research and investigations at Big Brother Watch, said: “Police across the country must take note of this fiasco. AI [artificial intelligence] surveillance that is experimental, untested, inaccurate or potentially biased has no place on our streets.”
Ramping up deployments without debate
While the use of LFR by police – beginning with the Met’s deployment at Notting Hill Carnival in August 2016 – has already ramped up in recent years, there has so far been minimal public debate or consultation, with the Home Office claiming for years that there is already a “comprehensive” legal framework in place.
However, in December 2025, the Home Office launched a 10-week consultation on the use of LFR by UK police, allowing interested parties and members of the public to share their views on how the controversial technology should be regulated.
The department has said that although a “patchwork” legal framework for police facial recognition exists (including for the increasing use of the retrospective and “operator-initiated” versions of the technology), it does not give police themselves the confidence to “use it at significantly greater scale … nor does it consistently give the public the confidence that it will be used responsibly”.
It added that the current rules governing police LFR use are “complicated and difficult to understand”, and that an ordinary member of the public would be required to read four pieces of legislation, police national guidance documents and a range of detailed legal or data protection documents from individual forces to fully understand the basis for LFR use on their high streets.
Before the consultation had even closed, however, the Home Office announced plans for the massive roll-out of AI and facial-recognition technologies as part of sweeping reforms to the UK’s “broken” policing system.
Under the proposals – announced in late January 2026, nearly three weeks before the consultation closed – the Home Office will increase the number of LFR vans available to police from 10 to 50; set up a new National Centre for AI in Policing – to be known as Police.AI – to build, test and assure AI models for policing contexts; and invest £115m over three years to help identify, test and scale new AI technologies in policing.
‘Panopticon’ vision
In a recent interview with former prime minister Tony Blair, UK home secretary Shabana Mahmood described her ambition to use technologies such as AI and LFR to achieve Jeremy Bentham’s vision of a “panopticon”, referring to his proposed prison design that would allow a single, unseen guard to silently observe every prisoner at once.
Typically used today as a metaphor for authoritarian control, the underpinning idea of the panopticon is that by instilling a perpetual sense of being watched among the inmates, they would behave as the authorities wanted.
“When I was in justice, my ultimate vision for that part of the criminal justice system was to achieve, by means of AI and technology, what Jeremy Bentham tried to do with his panopticon,” Mahmood told Blair. “That is that the eyes of the state can be on you at all times.”
Tech
A Top Democrat Is Urging Colleagues to Support Trump’s Spy Machine
United States congressman Jim Himes, the ranking Democrat on the House Intelligence Committee, is privately lobbying colleagues to preserve the FBI’s power to conduct warrantless searches of Americans’ communications, WIRED has learned, arguing that he has seen no evidence that the Trump administration is abusing its authority.
In a letter obtained by WIRED, Himes urges fellow Democrats to support the White House’s request to renew a controversial surveillance program that intercepts the electronic data of foreigners abroad. While targeted at foreigners, the program—authorized under Section 702 of the Foreign Intelligence Surveillance Act—also sweeps in vast quantities of private messages belonging to US citizens.
Himes’ pitch relies on the “56 reforms” passed by Congress in 2024, which codified the FBI’s own internal protocols as a substitute for constitutional warrants. In the letter, Himes claims these changes are “working as intended” to prevent domestic misuse, citing a compliance rate “exceeding 99 percent” over the past two years.
The structural foundations of that defense, however, have been fundamentally altered by recent changes within the FBI. Himes’ “99 percent” compliance metric was produced by the Office of Internal Auditing, for instance—a unit that long served as a smoke alarm designed to detect illegality, but no longer exists.
The unit was shuttered by FBI director Kash Patel last year. Historic court opinions based on its data had previously exposed hundreds of thousands of improper FBI searches. Without the auditors required to calculate failure rates, the compliance mechanisms Himes points to have effectively ceased to function.
In a statement, Himes’ office largely reiterated the positions laid out in his letter to colleagues. “I am open to making further reforms to Section 702, building on the many successful reforms we made in reauthorization legislation two years ago,” he says. “A short-term reauthorization of Section 702 will enable Congress to thoroughly debate the pros and cons of these suggested reforms—and to determine if compromise is possible—without placing our national security in peril by allowing the program to expire.”
As a member of the so-called Gang of Eight—a bipartisan group of lawmakers who are briefed on highly sensitive classified information—Himes possesses some of the deepest knowledge of the spy program. Nevertheless, his letter contains several other claims that appear fundamentally at odds with the mechanics of FISA oversight.
“Because of how heavily it is overseen by all three branches of government,” Himes says, “any effort to misuse the program would almost certainly become known to the Foreign Intelligence Surveillance Court and to Congress.”
The Foreign Intelligence Surveillance Court is a secret court that possesses no investigative arm to audit FBI databases. Similar to Congress, its oversight role is purely reactive, relying entirely on the US Justice Department to self-report violations.
“Neither Congress nor the FISA Court conducts independent audits of the FBI’s queries,” says Liza Goitein, senior director of the Brennan Center’s Liberty and National Security Program. “They rely on the Department of Justice to conduct thorough audits and to report the results truthfully and promptly. This particular Department of Justice has gutted internal oversight mechanisms and has been rebuked by dozens of federal courts for providing inaccurate, misleading, or incomplete information.”
No judges stand between the FBI and the private communications of millions of Americans, an arrangement that Himes and other members of his committee claim is necessary for the government to react quickly to terrorist threats. Critics argue that, given the current administration’s efforts to dismantle internal checks at the FBI, this is a massive vulnerability, leaving Americans exposed to surveillance abuses that could take years to come to light, if they are ever reported at all.
Tech
Lasers, robots, action: MIT workshop explores Raman spectroscopy
Could a three-hour workshop on an advanced materials analysis technique turn someone into a detective — or perhaps an art restorer?
At MIT’s Center for Bits and Atoms in late January, about a dozen students explored that possibility during an Independent Activities Period (IAP) workshop on Raman spectroscopy, a technique that uses laser light to “fingerprint” materials. The session even featured a robotic dog equipped with sensing equipment, demonstrating how chemical analysis can be done remotely.
The workshop, led by MIT postdoc Lamyaa Almehmadi in collaboration with the CBA, introduced participants to a powerful technique now used by law enforcement and first responders to identify narcotics and explosives, by gemologists to authenticate precious stones, and by pharmaceutical companies to verify raw materials and ensure product quality. CBA graduate researcher Jiaming Liu co-hosted, delivering lectures, demonstrating Raman equipment, and contributing to the curriculum and hands-on demonstrations.
“It can open up new possibilities for innovation across many fields,” said Almehmadi, an analytical chemist in the Department of Materials Science and Engineering (DMSE). After attendees learned the fundamentals, she encouraged them to think creatively about new applications: “My hope is to inspire all of you to think about doing something with Raman spectroscopy that no one has done before.”
Fingerprinting materials
Participants brought items to class to analyze using handheld devices, which fire laser light and measure how it bounces back. The resulting pattern behaves like a molecular fingerprint, identifying the materials in the item — whether it’s a paper clip, a piece of tree bark, or a mixing bowl.
Workshop attendee Sarah Ciriello, an administrative assistant at DMSE who brought a stone she found at the beach, was taken aback by the results. The Raman device suggested a 39 percent probability that the sample contained concrete-like material, with the remaining readings matching synthetic compounds — blurring the line between natural and manufactured materials.
“It’s man-made — I was surprised,” Ciriello said.
Developed in 1928 by Indian scientist C.V. Raman, who later won the Nobel Prize in Physics, Raman spectroscopy was groundbreaking because it used visible light to probe materials without destroying them, a major advantage over other techniques at the time, such as chromatography or mass spectrometry. But for decades, the Raman signal — the light scattered back from a sample — was weak, and the instruments were big and bulky, limiting its practical use.
Advances in lasers, computing power, and miniaturized optics have transformed Raman spectroscopy into a portable tool. Today’s handheld devices can instantly compare a sample’s molecular fingerprint against vast digital libraries, allowing users to identify thousands of materials in seconds. Because it doesn’t destroy the sample, Raman is especially useful in fields that require preserving materials — such as law enforcement, where evidence must remain intact, and art restoration.
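The matching step described above, comparing a measured fingerprint against a digital library, can be sketched with a toy cosine-similarity matcher. The spectra and library entries below are invented for illustration; real instruments use far denser spectra, vendor-specific libraries, and more sophisticated scoring.

```python
import numpy as np

# Hypothetical mini-library of Raman "fingerprints": intensities sampled
# on a shared wavenumber grid (real libraries hold thousands of entries).
library = {
    "baking soda": np.array([0.1, 0.9, 0.2, 0.05, 0.6]),
    "quartz":      np.array([0.8, 0.1, 0.1, 0.7, 0.05]),
    "polystyrene": np.array([0.2, 0.3, 0.9, 0.1, 0.4]),
}

def best_match(spectrum):
    """Rank library entries by cosine similarity to the measured spectrum."""
    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    scores = {name: cosine(spectrum, ref) for name, ref in library.items()}
    return max(scores, key=scores.get), scores

# A noisy scan resembling the baking-soda reference
measured = np.array([0.12, 0.85, 0.25, 0.07, 0.55])
name, scores = best_match(measured)
print(name)
```

Cosine similarity is a reasonable default here because it compares the shape of the spectrum rather than its absolute intensity, which varies with laser power and distance.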
Almehmadi’s own research focuses on advancing Raman spectroscopy by developing highly sensitive, semiconductor-based sensors that make portable chemical analysis possible, with applications ranging from medical diagnostics to forensic and environmental monitoring.
“Raman can be used to analyze any material,” Almehmadi says. “That’s why I decided to introduce it to students from diverse backgrounds.”
IAP classes are open to students and staff across MIT, and the Raman workshop reflected that range — from administrative staff to graduate and undergraduate students and postdocs in departments and labs including DMSE, the Department of Mechanical Engineering, the Media Lab, and the Broad Institute.
Walking the robot dog
A crowd-pleasing element in the workshop was the integration of a robot dog that belongs to the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL). The demonstration highlighted how Raman technology can be used in dangerous environments, such as crime scenes or toxic industrial sites.
The handheld device was secured to the robot using tape, and Almehmadi showed how she could navigate the dog to a plastic bag filled with a white powder — baking soda.
But in a real-world scenario, “How can we know if it is baking soda or not?” she says. “So we just shined the light, and then the instrument told us what it was.”
Participants used a Wi-Fi app on their phones to view the results and a small remote controller to operate the robotic dog themselves.
“I loved the robot dog,” Ciriello says. “I was able to control it a bit, but it was challenging because the gauge was really sensitive.”
Michael Kitcher, a postdoc in DMSE, also praises the robot demonstration.
“Given that we just duct taped the device onto the dog — it was cool to see it actually worked,” he says.
Looking ahead
Kitcher, who researches magnetic materials for electronic applications, joined the workshop to learn more about Raman spectroscopy, which he had read about but never used. He was impressed by its versatility — in addition to the beach stone and baking soda, the device identified materials in a contact lens, cosmetics, and even a diamond.
Although it struggled to analyze a piece of chocolate he brought — other signals from the chocolate interfered — Kitcher sees strong potential for his own research. One area he’s interested in is unconventional magnetic materials, such as altermagnets, with unusual magnetic behavior that researchers hope to better understand and control for more energy-efficient electronics.
“Over the last couple of years, people have been trying to get a better sense of why these materials behave the way they do — how we can control this unconventional magnetic order,” he says. Raman spectroscopy can probe the vibrations of atoms in a material, helping researchers detect patterns in the crystal structure that underlie unusual magnetic behaviors. By understanding these vibrations, scientists could unlock material design rules that enable ultra-fast, low-energy computing.
Hands-on workshops like this one, which inspire innovative future applications, are at the heart of an MIT education, Almehmadi says.
“I’ve always learned best by doing,” she says. “Lectures and reading are important, but real understanding comes from hands-on experience.”
Tech
Gamers Hate Nvidia’s DLSS 5. Developers Aren’t Crazy About It, Either
After a day of widespread, overwhelming pushback, Nvidia CEO Jensen Huang doubled down and said gamers are “completely wrong” about DLSS. (You know how much gamers love being told that they’re wrong.) But developers at Capcom and Ubisoft say they didn’t know what the tech demo would look like; according to Insider Gaming, they found out about it at the same time as everyone else and were just as surprised. (Nvidia, Ubisoft, and Capcom did not immediately respond to requests for comment.)
“I think the reaction from gamers is understandable,” Marwan Mahmoud, a game developer at Incrypt, wrote in an email to WIRED. “Some games started relying too heavily on these technologies instead of focusing on proper optimization. From a developer perspective, it feels a bit different because you see DLSS as a tool that helps rather than a core solution.”
The problem for many people, developers included, is the one-size-fits-all approach of a technology that can adjust visuals across various game types.
“The artist has a style, the artist has an art direction that you’re going to give him, and that’s something that AI kind of doesn’t respect all the time,” says Raúl Izquierdo, an indie game developer in Mexico. “Maybe I don’t want my characters to be yassified.”
Bates agrees, saying he doesn’t think every game needs to be photorealistic. That sentiment is echoed by game developer Sterling Reames, who has worked at Striking Distance Studios and Zynga. “People just want better games,” Reames wrote in a message to WIRED. “That’s as plainly as I can put it.”
At GTC, Nvidia ran its demo on its most powerful consumer graphics cards, two GeForce RTX 5090s. Had Nvidia pitched the tech as a way to save resources, enabling older hardware to deliver more impressive graphics, there might have been something to it.
“What’s the point if you’re not going to do it on weaker hardware?” Izquierdo says. “If this were done on an [RTX 2080 graphics card], for instance, I think I would be thinking differently about it. OK, this is for the betterment of gamers’ experiences and everything, not just for selling graphic cards.”
Ultimately, Nvidia’s demo, and GTC writ large, was a flex of the company’s power in the AI space. The reaction, Bates posits, is more about humans dealing with not just crossing the uncanny valley, but what happens when we reach the other side.
“Right now it’s pretty clearly a thing they are forced to do to demonstrate their prowess as an AI company,” Bates says. “But the truth is, this is going to be the default in a few years, and nobody is even going to think twice about it. It’s Jensen’s world, we’re just living in it.”
