
Oh No! A Free Scale That Tells Me My Stress Levels and Body Fat



I will admit to being afraid of scales—the kind that weigh you, not the ones on a snake. And so my first reaction to the idea I’d be getting a free body-scanning scale with a Factor prepared meal kit subscription was something akin to “Oh no!”

It’s always bad or shameful news, I figured, and maybe nothing I don’t already know. Though, as it turned out, I was wrong on both points.

Factor is, of course, the prepared meal brand from meal kit giant HelloFresh, which I’ve tested while reviewing dozens of meal kits this past year. Think delivery TV dinners, but actually fresh and never frozen. Factor meals are meant to be microwaved, but I found when I reviewed Factor last year that the meals actually tasted much better if you air-fry them (ideally using a Ninja Crispi, the best reheating device I know).

Factor especially excels at the low-carb, protein-rich diet that has become equally fashionable among people who want to lose weight and people who like to lift it. Hence, this scale. Factor would like you to be able to track your progress in gaining muscle mass, losing fat, or both, and then presumably keep using Factor to meet your fitness or wellness goals.

While your first week of Factor comes at a discount right now, regular-price meals will be $14 to $15 a serving, plus $11 shipping per box. That’s less than most restaurant delivery, but certainly more than if you were whipping up these meals yourself.

If you subscribe between now and the end of March, the third Factor meal box will come with a free Withings Body Comp scale, which generally retails north of $200. The Withings doesn’t just weigh you. It scans your proportions of fat and bone and muscle, and indirectly measures stress levels and the elasticity of your blood vessels. It is, in fact, WIRED’s favorite smart scale, something like a fitness watch for your feet.

Anyway, to get the deal, use the code CONWITHINGS on Factor’s website, or follow the promo code link below.

Is It My Body?

The scale that comes with the Factor subscription is about as fancy as it gets: a $200 Body Comp scale from high-tech fitness monitoring company Withings. The scale uses bioelectrical impedance analysis and other proprietary methods to measure not just your weight but also your body fat percentage, lean muscle mass, visceral fat, bone and water mass, pulse rate, and even the stiffness of your arteries.
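Bioelectrical impedance analysis generally works by passing a tiny electrical current through the body and estimating lean mass from the measured impedance, typically via a linear regression on the "impedance index" (height squared divided by impedance). Withings' actual model is proprietary, so the function names and coefficients below are purely illustrative, a minimal sketch of the general approach:

```python
# Illustrative sketch of how a BIA scale estimates body composition.
# The coefficients a, b, c are placeholders, NOT Withings' real model.

def estimate_lean_mass_kg(height_cm: float, weight_kg: float,
                          impedance_ohm: float) -> float:
    """Generic linear BIA regression: lean mass from the impedance
    index (height^2 / impedance) plus body weight."""
    impedance_index = height_cm ** 2 / impedance_ohm
    a, b, c = 0.4, 0.2, 5.0  # placeholder regression coefficients
    return a * impedance_index + b * weight_kg + c

def body_fat_percent(height_cm: float, weight_kg: float,
                     impedance_ohm: float) -> float:
    """Body fat is whatever weight the lean-mass estimate leaves over."""
    lean = estimate_lean_mass_kg(height_cm, weight_kg, impedance_ohm)
    return 100.0 * (weight_kg - lean) / weight_kg
```

Real BIA equations are population-specific regressions fitted against reference methods such as DEXA scans, which is one reason physicians caution against treating these readings as exact.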

To get all this information, all you really need to do is stand on the scale for a few minutes. The scale will recognize you based on your weight (you’ll need to be accurate in describing yourself when you set up your profile for this to work), and then cycle through a series of measurements before giving you a cheery weather report for the day.


Your electrodermal activity—the “skin response via sweat gland stimulation in your feet”—provides a gauge of stress, or at least excitation. The Withings also purports to measure your arterial age, or stiffness, via the velocity of your blood with each heartbeat. This sounds esoteric, but it has some scientific backing.

Note that many physicians caution against taking indirect measurements of body composition as gospel. Other physicians counter that previous “gold standard” measurements aren’t perfectly accurate, either. It’s a big ol’ debate. For myself, I tend to take smart-scale measurements as a convenient way to track progress, and also a good home indicator for when there’s a problem that may require attention from a physician.

And so of course, I was petrified. So much bad news to get all at once! I figured.





Discovering the Dimensions of a New Cold War



In 2025, American and world leaders were preoccupied with wars in the Middle East. Most dramatically, Israel and then the United States bombed Iran’s nuclear facilities. Some commentators feared that President Trump’s decision to bomb Iran would drag the United States into the “forever wars” in the Middle East that presidential candidate Trump had pledged to avoid. The tragic war in Gaza had become a humanitarian disaster. After years of promises from Democratic and Republican presidents alike to reduce engagement with the region, it appeared that the US was being dragged back into the Middle East once again.

I hope that’s not the case. Instead, in 2026, President Trump, his administration, the US Congress, and the American people more generally must realize that the real challenges to American national interests, the free world, and global order come not from the Middle East but from autocratic China and Russia. The three-decade honeymoon from great power politics after the collapse of the Soviet Union and the end of the Cold War is over. For the United States to succeed in this new era of great power competition, US strategists must first accurately diagnose the threat and then devise and implement effective prescriptions.

The oversimplified assessment is that we have entered a new Cold War with Xi’s China and his sidekick, Russian leader Vladimir Putin. To be sure, there are some parallels between our current era of great power competition and the Cold War. First, the balance of power in the world today is dominated by two great powers, the United States and China, much as the United States and the Soviet Union dominated the world during the Cold War. Second, like the contest between communism and capitalism during the last century, there is an ideological conflict between the great powers today. The United States is a democracy. China and Russia are autocracies. Third, at least until the second Trump era, all three of these great powers have sought to propagate and expand their influence globally. That too was the case during the last Cold War.

At the same time, there are also some significant differences. Superimposing the Cold War metaphor onto everything in the US-China rivalry today distorts as much as it illuminates.

First, while the world is dominated by two great powers, the United States remains more powerful than China on many dimensions of power—military, economic, ideological—and especially so when allies are added to the equation. Also different from the Cold War, several mid-level powers have emerged in the global system—Brazil, India, Indonesia, Saudi Arabia, and South Africa, among others—that are not willing to join exclusively the American bloc or the Chinese bloc.

Second, while the ideological dimension of great power competition is real, it is not as intense as the Cold War. The Soviets aimed to spread communism worldwide, including in Europe and the United States. They were willing to deploy the Red Army, provide military and economic assistance, overthrow regimes, and fight proxy wars with the United States to achieve that aim. So far, Xi Jinping and the Communist Party of China have not employed these same aggressive methods to export their model of governance or construct an alternative world order. Putin is much more aggressive in propagating his ideology of illiberal nationalism and seeking to destroy the liberal international order. Thankfully, however, Russia does not have the capabilities of China to succeed in these revisionist aims.




Walmart Promo Codes for December 2025



After living in big cities like San Francisco and New York, when I set foot in Wally World in the Midwest, I heard angels sing. Rows and rows of fluorescent lights highlighted any and every product needed for your house in one place. Screw the mom-and-pop bodega—I missed this level of convenience. If by chance they don’t have what you need in-store, there’s even more online, with pickup and delivery available.

Save $10 With Our Limited-Time Walmart Promo Code

Skip the line at your local Walmart and take $10 off each of your first three delivery or pickup orders of $50 or more with our Walmart coupon code, TRIPLE10. So, whether you’re stocking up on late-night munchies or toiletries for your next getaway, you can take $10 off your next purchase now through the end of the year.

No Walmart Coupon? No Problem.

Walmart has quite literally thousands of flash deals that change weekly, with up to 65% off tech, appliances, end-of-season, and holiday items, so be sure to check often to find the best rotating deals. And if you’re like me, I’m always searching for the best tech deals without breaking the bank. So whether you’re looking to purchase a new 17-piece non-stick cookware set, Dyson cordless vacuum cleaner, or this season’s latest clothing trends for men, women or children—Walmart is your one-stop shop for it all.

You can also enjoy great benefits with Walmart+, a paid membership that gives early access to promotions and events like Walmart Black Friday deals, free delivery, free shipping with no order minimum, savings on fuel, streaming with Paramount+, and more. You can pay monthly or annually, and you’ll get a free trial of Walmart+ for 30 days to try it out. Walmart+ Assist helps qualifying government aid recipients get a membership at a lower cost.




Top 10 police technology stories of 2025 | Computer Weekly



In 2025, Computer Weekly’s police technology coverage focused extensively on developments in the use of data-driven technologies such as facial recognition and predictive policing.

This included stories on the Met’s decision to deploy permanent live facial recognition (LFR) cameras in Croydon and the Home Office launching a formal consultation on laws to regulate its use, as well as reports examining the lawfulness, necessity and proportionality of how UK police are using the technology.

Further stories continued Computer Weekly’s ongoing coverage of police hyperscale cloud use, after documents obtained from Scottish policing bodies revealed that Microsoft is refusing to hand them critical information about its data flows.

Computer Weekly also reported on efforts to change police data protection rules, which essentially legalise previously unlawful practices and pose a risk to the UK’s law enforcement data adequacy with the European Union (EU).

One investigation by freelance journalists Apostolis Fotiadis, Giacomo Zandonini and Luděk Stavinoha also revealed how the EU’s law enforcement agency has been quietly amassing data to feed an ambitious-but-secretive artificial intelligence (AI) development programme.

The Home Office formally opened a consultation on the use of facial recognition by UK police at the start of December 2025, saying the government is committed to introducing a legal framework that sets out clear rules for the technology.

The move – initially announced by policing minister Sarah Jones in early October 2025 after then home secretary Yvette Cooper told a Lords Committee in July that the UK government will create “a proper, clear governance framework” to regulate police use of the tech – marks a distinct shift in Home Office policy, which for years has claimed there is already a “comprehensive” legal framework in place.

The Home Office has now said that although a “patchwork” legal framework for police facial recognition exists (including for the increasing use of the retrospective and “operator-initiated” versions of the technology), it does not give police themselves the confidence to “use it at significantly greater scale … nor does it consistently give the public the confidence that it will be used responsibly”.

It added that the current rules governing police LFR use are “complicated and difficult to understand”, and that an ordinary member of the public would be required to read four pieces of legislation, police national guidance documents and a range of detailed legal or data protection documentation from individual forces to fully understand the basis for LFR use on their high streets.

While the use of LFR by police – beginning with the Met’s deployment at Notting Hill Carnival in August 2016 – has ramped up massively in recent years, there has so far been minimal public debate or consultation.

UK police forces are “supercharging racism” through their use of automated “predictive policing” systems, as they are based on profiling people or groups before they have committed a crime, according to a 120-page report published by Amnesty International.

While proponents claim these systems can help more efficiently direct resources, Amnesty highlighted how predictive policing tools are used to repeatedly target poor and racialised communities, as these groups have historically been “over-policed” and are therefore massively over-represented in police data sets.

This then creates a negative feedback loop, where these so-called “predictions” lead to further over-policing of certain groups and areas; reinforcing and exacerbating the pre-existing discrimination as increasing amounts of data are collected.

“The use of predictive policing tools violates human rights. The evidence that this technology keeps us safe just isn’t there, the evidence that it violates our fundamental rights is clear as day. We are all much more than computer-generated risk scores,” said Sacha Deshmukh, chief executive at Amnesty International UK, adding that these systems are deciding who is a criminal based “purely” on the colour of their skin or their socio-economic background.

In June 2025, Green Party MP Siân Berry argued in the Commons that “predictive” policing technologies infringe human rights “at their heart” and should be prohibited in the UK, after tabling an amendment to the government’s forthcoming Crime and Policing Bill.

Highlighting the dangers of using predictive policing technologies to assess the likelihood of individuals or groups committing criminal offences in the future, Berry said that “such technologies, however cleverly sold, will always need to be built on existing, flawed police data … That means that communities that have historically been over-policed will be more likely to be identified as being ‘at risk’ of future criminal behaviour.”

Berry’s amendment would also prohibit the use of certain information by UK police to “predict” people’s behaviour: “Police forces in England and Wales shall be prohibited from … Predicting the occurrence or reoccurrence of an actual or potential criminal offence based on profiling of a natural person or on assessing personality traits and characteristics, including the person’s location, or past criminal behaviour of natural persons or groups of natural persons.”

In April, the Met Police announced it was planning to install the UK’s first permanent LFR cameras in Croydon, but critics raised concerns that this continues the force’s pattern of deploying the technology in areas where the Black population is much higher than the London average.

Local councillors also complained that the decision to set up facial recognition cameras permanently has taken place without any community engagement from the force with local residents, echoing situations that have happened in boroughs such as Newham and Lewisham.

According to data gathered by Green Party London Assembly member Zoë Garbett, over half of the 180 LFR deployments that took place during 2024 were in areas where the proportion of Black residents is higher than the city’s average, including Lewisham and Haringey.

While Black people comprise 13.5% of London’s total population, the proportion is much higher in the Met’s deployment areas, with Black people making up 36% of the Haringey population, 34% of the Lewisham population, and 40.1% of the Croydon population.

“The Met’s decision to roll out facial recognition in areas of London with higher Black populations reinforces the troubling assumption that certain communities … are more likely to be criminals,” she said, adding that while nearly two million people in total had their faces scanned across the Met’s 2024 deployments, only 804 arrests were made – a rate of just 0.04%.
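Garbett’s arrest-rate figure is easy to verify: 804 arrests against roughly two million scanned faces works out to about 0.04 percent.

```python
# Quick check of the arrest rate quoted above: 804 arrests out of
# roughly two million faces scanned across the Met's 2024 deployments.
scans = 2_000_000
arrests = 804
rate_percent = 100 * arrests / scans
print(f"{rate_percent:.2f}%")  # prints "0.04%"
```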

In March 2025, Computer Weekly reported that proposed reforms to police data protection rules could undermine law enforcement data adequacy with the European Union (EU).

During the committee stage of Parliamentary scrutiny, the government’s Data Use and Access Bill (DUAB) – now an act – sought to amend the UK’s implementation of the EU Law Enforcement Directive (LED), which is transposed into UK law via the current Data Protection Act (DPA) 2018 and represented in Part Three of the DPA, specifically.

In combination with the current data handling practices of UK law enforcement bodies, the bill’s proposed amendments to Part Three – which include allowing the routine transfer of data to offshore cloud providers, removing the need for police to log justifications when accessing data, and enabling police and intelligence services to share data outside of the LED rules – could present a challenge for UK data adequacy.

In June 2021, the European Commission granted “data adequacy” to the UK following its exit from the EU, allowing the free flow of personal data to and from the bloc to continue, but warned the decision may yet be revoked if future data protection laws diverge significantly from those in Europe.

While Computer Weekly’s previous reporting on police hyperscale cloud use has identified major problems with the ability of these services to comply with Part Three, the government’s DUAB changes are seeking to solve the issue by simply removing the requirements that are not being complied with.

To circumvent the lack of compliance with these transfer requirements, the government has simply dropped them from the DUAB, meaning policing bodies will no longer be required to assess the suitability of the transfer or report it to the data regulator.

In August, Computer Weekly reported on documents obtained from the Scottish Police Authority (SPA), which showed that Microsoft is refusing to tell Scottish policing bodies where and how the sensitive law enforcement data uploaded to its cloud services will be processed.

Citing “commercial confidentiality”, the tech giant’s refusal to hand over crucial information about its international data flows to the SPA and Police Scotland means the policing bodies are unable to satisfy the law enforcement-specific data protection rules laid out in Part Three of the Data Protection Act 2018 (DPA18), which places strict limits on the transfer of policing data outside the UK.

“MS is unable to specify what data originating from SPA will be processed outside the UK for support functions,” said the SPA in a detailed data protection impact assessment (DPIA) created for its use of O365. “To try and mitigate this risk, SPA asked to see … [the transfer risk assessments] for the countries used by MS where there is no [data] adequacy. MS declined to provide the assessments.”

The SPA DPIA also confirms that, on top of refusing to provide key information, Microsoft itself has told the police watchdog it is unable to guarantee the sovereignty of policing data held and processed within its O365 infrastructure.

Further revelations published by Computer Weekly a month later showed that policing data hosted in Microsoft’s hyperscale cloud infrastructure could be processed in more than 100 countries.

This information was not provided to the policing bodies by Microsoft, and only came to light because of an analysis conducted by independent security consultant Owen Sayers, who identified from the tech giant’s own distributed online documentation that Microsoft personnel or contractors can remotely access the data from 105 different countries, using 148 different sub-processors.

Although the documentation – which is buried in non-indexed, difficult-to-find web pages – has come to light in the context of Computer Weekly investigating police cloud use, the issue of routine data transfers in Microsoft’s cloud architecture affects the whole of the UK government and public sector, which are obliged by the G-Cloud and Tepas frameworks to ensure data remains in the UK by default.

According to multiple data protection litigation experts, the reality of Microsoft’s global data processing here, on top of its failure to meet key Part Three obligations, means data subjects could have grounds to successfully claim compensation from Police Scotland or any other force using hyperscale cloud infrastructure.

In November 2025, freelance journalists Apostolis Fotiadis, Giacomo Zandonini and Luděk Stavinoha published an extensive investigation into how the EU’s law enforcement agency has been quietly amassing data to feed an ambitious-but-secretive AI development programme.

Based on internal documents obtained from Europol, and analysed by data protection and AI experts, the investigation raised serious questions about the implications of the agency’s AI programme for people’s privacy across the bloc. 

It also raised questions about the impact of integrating automated technologies into everyday policing across Europe without adequate oversight.

In May 2025, Computer Weekly reported on an equality impact assessment that Essex Police had created for its use of live facial recognition, but the document itself – obtained under Freedom of Information rules by privacy group Big Brother Watch and shared exclusively with Computer Weekly – was plagued with inconsistencies and poor methodology.

The campaigners told Computer Weekly that, given the issues with the document, the force had likely failed to fulfil its public sector equality duty (PSED) to consider how its policies and practices could be discriminatory.

They also highlighted how the force is relying on false comparisons to other algorithms and “parroting misleading claims” from the supplier about the LFR system’s lack of bias.

Other experts noted the assessment was “clearly inadequate”, failed to look at the systemic equalities impacts of the technology, and relied exclusively on testing of entirely different software algorithms used by other police forces trained on different populations to justify its conclusions.

After being granted permission to intervene in a judicial review of the Met’s LFR use – brought by anti-knife campaigner Shaun Thompson, who was wrongly stopped by officers after a false LFR identification – the UK’s equality watchdog said the force’s use of the tech is unlawful.

Highlighting how the Met is failing to meet key legal standards with its deployments – particularly around Articles 8 (right to privacy), 10 (freedom of expression) and 11 (freedom of assembly and association) of the European Convention on Human Rights – the UK’s Equality and Human Rights Commission (EHRC) said LFR should only be used where necessary, proportionate and constrained by appropriate safeguards.

“We believe that the Metropolitan Police’s current policy falls short of this standard,” said EHRC chief John Kirkpatrick.

The EHRC further highlighted how, when used on a large scale, even low error rates can affect a significant number of people by bringing unnecessary and unwanted police attention, and warned that its use at protests could have a “chilling effect” on people’s freedom of expression and assembly.
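The EHRC’s scale argument can be made concrete with a back-of-the-envelope calculation. The false-positive rate below is an assumed figure for illustration, not one reported here; the scan volume is the roughly two million faces the Met scanned in 2024:

```python
# Hypothetical illustration of the EHRC's point: at scale, even a small
# false-positive rate flags many people. The rate is an assumption.
false_positive_rate = 0.001   # assumed 0.1% per face scanned
faces_scanned = 2_000_000     # roughly the Met's 2024 volume
expected_false_matches = false_positive_rate * faces_scanned
print(int(expected_false_matches))  # prints 2000
```

Even under this modest assumed error rate, thousands of people would be wrongly flagged for police attention.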

Senior police officers from both the Met and South Wales Police have previously argued that a major benefit of facial-recognition technology is its “deterrence effect.”

A comparative study of LFR trials by law enforcement agencies in London, Wales, Berlin and Nice found that although “in-the-wild” testing is an important opportunity to collect information about how AI-based systems like LFR perform in real-world deployment environments, the police trials conducted so far have failed to take into account the socio-technical impacts of the systems in use, or to generate clear evidence of the operational benefits.

Highlighting how real-world testing of LFR systems by UK and European police is a largely ungoverned “Wild West”, the authors expressed concern that “such tests will be little more than ‘show trials’ – public performances used to legitimise the use of powerful and invasive digital technologies in support of controversial political agendas for which public debate and deliberation is lacking, while deepening governmental reliance on commercially developed technologies which fall far short of the legal and constitutional standards which public authorities are required to uphold”.

Given the scope for interference with people’s rights, the authors – Karen Yeung, an interdisciplinary professorial fellow in law, ethics and informatics at Birmingham Law School, and Wenlong Li, a research professor at Guanghua Law School, Zhejiang University – said that evidence of the technology’s effectiveness in producing its desired benefits “must pass an exceptionally high threshold” if police want to justify its use.

They added that without a rigorous and full accounting of the technology’s effects – which is currently not taking place in either the UK or Europe – it could lead to the “incremental and insidious removal” of the conditions that underpin our rights and freedoms.


