The Ultrahuman Home is a futuristic-looking home environment monitor that tracks air quality, light, sound, and temperature. All this data flows into the Ultrahuman app on your phone, offering potential insights into your environment and suggestions on how you could make it healthier. Sadly, this mostly amounts to reminders to crack a window open, because most of the touted features are not yet present and correct, despite the rather hefty $550 price.
Ultrahuman made its name with a subscription-free smart ring that made biohacking more affordable (though it may soon be banned in the US due to a lawsuit from Oura). The Home monitor may seem like a strange sidestep, but if you’re going to hack your body, why not your environment? After all, we know air quality, light and sound exposure, and temperature and humidity can impact our sleep and general health.
Setup and Tracking
Photograph: Simon Hill
Taking a leaf from Apple’s playbook, the Ultrahuman Home is a 4.7-inch anodized aluminum block with rounded corners (it looks like a Mac Mini). There’s an Ultrahuman logo and light sensor on top, a power button and LED on the front, and a USB-C port on the back flanked by privacy switches to turn off the microphone or connectivity (Wi-Fi and Bluetooth).
Setup is super simple: Plug it in and add it via the Ultrahuman app. The Home gets its own tab at the bottom of the app, alongside the ring, and tapping it brings up a score out of 100 indicating how healthy your environment is. Scroll down for a breakdown of the four scores that combine to create your overall Home score (air quality, environmental comfort, light exposure, and UV exposure).
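Ultrahuman doesn't publish how that overall number is derived, but conceptually it is just a weighted blend of the four sub-scores. Here's a minimal sketch in Python, with purely hypothetical equal weights, of how such a composite score could be computed:

```python
# Hypothetical illustration only: Ultrahuman does not disclose its actual scoring formula or weights.
def home_score(air_quality: float, comfort: float, light: float, uv: float) -> float:
    """Blend four 0-100 sub-scores into a single 0-100 Home score."""
    weights = {"air_quality": 0.25, "comfort": 0.25, "light": 0.25, "uv": 0.25}  # assumed equal weighting
    subscores = {"air_quality": air_quality, "comfort": comfort, "light": light, "uv": uv}
    total = sum(weights[name] * subscores[name] for name in weights)
    return min(max(total, 0.0), 100.0)

# Example: strong air quality and comfort, weaker light and UV exposure.
print(home_score(air_quality=90, comfort=80, light=60, uv=50))  # 70.0
```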
Ultrahuman via Simon Hill
To compile all this data, the Ultrahuman Home is packed with sensors:
Air quality sensors to track things like volatile organic compounds (VOCs), typically released by cleaning fluids, and carbon dioxide levels (CO₂) that might indicate poor ventilation. They also watch out for formaldehyde (HCHO), carbon monoxide (CO), and smoke.
Particulate matter sensors to track tiny particles in the air, including things like dust, pollen, mold spores, and particles released by cooking. Covering PM1.0, PM2.5, and PM10 (the number refers to the particle diameter in microns), the Home warns if you're at risk of breathing these particles in; a rough sketch of how such readings map to health bands follows this list.
Temperature and humidity sensors to track how warm or cool it is and how much moisture is in the air. You get a chart of the temperature in your environment and the humidity level.
Light sensors to track the level of light and also its makeup, including the amount of blue light and ultraviolet (UV) exposure.
Microphones to track the noise levels in your environment, showing noise in decibels in a chart.
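To give a sense of how a raw reading turns into a warning, here's a minimal sketch of banding a PM2.5 value; the thresholds are illustrative approximations of common air-quality guidance, not Ultrahuman's actual alert logic:

```python
# Illustrative only: the band boundaries below are approximate and are not Ultrahuman's thresholds.
def pm25_band(pm25_ug_m3: float) -> str:
    """Map a PM2.5 concentration (micrograms per cubic metre) to a rough health band."""
    if pm25_ug_m3 <= 12:
        return "good"
    if pm25_ug_m3 <= 35:
        return "moderate"
    if pm25_ug_m3 <= 55:
        return "unhealthy for sensitive groups"
    return "unhealthy"

# Example: a cooking spike that pushes PM2.5 to around 40 µg/m³.
print(pm25_band(40))  # unhealthy for sensitive groups
```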
Ultrahuman via Simon Hill
The data is all easy to access and read in the app. You get notifications throughout the day, including alerts if VOC levels spike or there's prolonged noise. I set the Home up in my office for a few weeks, then tried it for another couple of weeks in my bedroom after I moved house. This raises the question of where to put it, because it must be plugged in and isn't really designed to be moved around. The bedroom seems like the best bet, but ideally you'd cover both rooms, and I can't imagine springing for two or more of these to cover all your bases.
Oversensitive and Alarming
Photograph: Simon Hill
The idea of combining body and environment tracking data seems smart, but the Ultrahuman Home doesn’t really do it yet. The touted UltraSync with the Ultrahuman Ring Air is limited to basic common sense advice for now. I don’t think anyone really needs a box to tell them they will sleep better in the dark and quiet, and the air quality advice mostly amounts to opening a window for better ventilation.
Running with wet feet, in wet socks, in wet shoes is the perfect recipe for blisters. It’s also a fast track to low morale. Nothing dampens spirits quicker than soaked socks. On ultra runs, I always carry spares. And when faced with wet, or even snowy, mid-winter miles, the lure of weatherproof shoes is strong. Anything that can stem the soggy tide is worth a go, right?
This isn’t as simple an answer as it sounds. In the past, a lot of runners—that includes me—felt waterproof shoes came with too many trade-offs, like thicker, heavier uppers that change the feel of your shoes or a tendency to run hot and sweaty. In general, weatherproof shoes are less comfortable.
But waterproofing technology has evolved, and it might be time for a rethink. Winterized shoes can now be as light as the regular models, breathability is better, and the comfort levels have improved. Brands are also starting to add extra puddle protection to some of the most popular shoes. So it’s time to ask the questions again: Just how much difference does a bit of Gore-Tex really make? Are there still trade-offs for that extra protection? And is it really worth paying the premium?
I spoke to the waterproofing pros, an elite ultra runner who has braved brutal conditions, and some expert running shoe testers. Here’s everything you need to know about waterproof running shoes in 2026. Need more information? Check out our guide to the Best Running Shoes, our guide to weatherproof fabrics, and our guide to the Best Rain Jackets.
How Do Waterproof Running Shoes Work?
On a basic level, waterproof shoes add extra barriers between your nice dry socks and the wet world outside. If you’re running through puddles deep enough to breach your heel collars, you’re still going to get wet feet. But waterproof shoes can protect against rain, wet grass, snow, and smaller puddles.
Gore-Tex is probably the most common waterproofing tech in footwear, but it's not the only game in town. Some brands have proprietary tech, or you might come across alternative systems like eVent and Sympatex. Still, that GTX stamp is the one you're most likely to encounter, so here's how it works.
The water resistance comes from a layered system: a durable water repellent (DWR) coating on the uppers paired with an internal membrane, along with details like taped seams, more tightly woven mesh on the uppers, gusseted tongues, and higher, gaiter-style heel collars.
The UK government is launching a call for evidence on how technology, changing market dynamics and regulation are shaping investment in mobile networks.
The call for evidence was introduced as an important step in securing a "comprehensive" view of how the UK mobile market is changing, and in identifying what more can be done to support investment, innovation and competition for the benefit of consumers and businesses. It will assess the factors affecting investment in high-quality connectivity by 2030, identify actions to help the sector achieve government objectives over the next decade, and examine how the regulatory framework can be improved to support investment, innovation and competition.
As part of this, the government is announcing an action plan based on four key principles: drive investment in comprehensive, high-quality connectivity by 2030; deliver for consumers; support innovation and growth across the economy; and provide secure and resilient connectivity.
Introducing the call, Liz Lloyd, parliamentary under-secretary of state at the Department for Science, Innovation and Technology and minister for digital economy, said that in an era of rapid technological transformation, new technologies and wireless services were critical to day-to-day lives, the economy and society in general.
Lloyd added that digital infrastructure is the core enabler of this transformation, and that it was crucial the UK’s telecommunications networks were ready for the future. She stressed that mobile and other digital networks, such as fibre networks, will drive growth and innovation across the country, deliver modern public services, increasingly underpin critical national infrastructure, and be essential for ensuring people everywhere were digitally included.
To that end, she said, the government's ambition remains for all populated areas to have access to higher-quality standalone 5G by 2030. The immediate challenge is to secure the investment needed to deliver this ambition, driving digital inclusion and ensuring businesses can depend on the connectivity that underpins modern life.
“Our coverage ambition goes hand in hand with affordability of access so that everyone can carry out essential online activities and, aligned with the government’s tech adoption agenda, supports take-up of premium 5G-enabled services across the economy,” said Lloyd.
Looking forward, Lloyd said the government must also anticipate how the mobile market – and technologies that underpin it – will evolve, and what this means for its objectives over the next decade, shaping a framework that supports innovation, investment and the needs of future users.
In the action plan, the minister referenced the government's digital inclusion action plan, in which access to secure and reliable connectivity is seen as the foundation for ensuring that people everywhere can get online. She noted, however, that delivering these benefits depends on substantial investment in mobile networks.
To date, the UK's mobile network operators have been investing heavily in the country's mobile networks, averaging £2bn annually between 2020 and 2024. In particular, following the merger of its two component parties, VodafoneThree has committed to investing £11bn in creating its merged network, while competitors BTEE and Virgin Media O2 have also planned investment in upgrading their networks. BTEE, for example, has an ambition to deliver standalone 5G to 99% of the population by the end of 2030.
Lloyd said the UK government would support industry to deliver this investment, including by removing barriers to deployment and ensuring digital connectivity is appropriately considered and built into new infrastructure projects from the outset. However, she warned that the UK mobile sector stands at a critical inflection point, facing rapid market changes coupled with persistent investment challenges.
Lloyd said governments and regulators across the globe are considering how their telecoms policies and regulatory frameworks can best drive innovation and investment in this new era. That, she emphasised, is why it is necessary to act immediately to understand the challenges, safeguard the UK’s international competitiveness, and deliver the high-quality, nationwide connectivity the UK relies on.
The call for evidence and the four-point action plan were designed to realise the potential of the mobile sector, and the UK government said it recognised that doing so would require concerted and coordinated action across government and industry to deliver the coverage needed this decade and shape the mobile market for the future.
The government said that, in creating its call for evidence, it welcomed responses from across the ecosystem, including mobile operators, infrastructure providers, technology companies, local authorities, public sector bodies, civil society organisations, academia and investors. The call will run until 11:59pm on 21 April 2026.
A popular topic of conversation of late has been the existence of an artificial intelligence (AI) bubble and the likelihood that it will burst with great detriment to the IT industry as a whole. Yet, and perhaps surprisingly, the impact of a bursting bubble on digital twins might not be as problematic as one might think.
Ready adoption and fast diffusion of AI might justify the tremendous investment flows of recent years and quickly create revenue and profit streams. Equally, we might be standing on the precipice of a bubble popping that will lead to sweeping valuation corrections. Either way, digital twins stand to benefit from advancing AI, although the timeline of AI-enabled digital twin applications may shift.
Since the start of 2023, AI-related company valuations have ballooned. OpenAI is often credited with starting the AI frenzy by releasing ChatGPT at the end of 2022. The company was valued at $29bn in 2023 and reached $500bn in October 2025, with observers wondering whether it can pull off a $1tn initial public offering soon.
AI chip leader Nvidia’s stock, meanwhile, multiplied by 13 between the beginning of 2023 and the end of October 2025, making it the first $5tn company ever. Even companies that are related but not at the centre of AI developments have increased substantially in value, with the stock price of Microsoft and Alphabet more than doubling and tripling, respectively, during that time period.
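For context, that 13-fold rise over roughly 34 months implies an average annualised multiple of about 2.5, or around 150% growth per year. A quick back-of-the-envelope check, my own arithmetic using the figures above:

```python
# Back-of-the-envelope check: a 13x rise over ~34 months, annualised.
annualised_multiple = 13 ** (12 / 34)
print(round(annualised_multiple, 2))  # ~2.47, i.e. roughly 150% growth per year on average
```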
AI encompasses many different types of technologies and has many use cases, so it should be seen as an enabling technology rather than a sole application or a market per se. AI will play a major role across most application areas, but to varying degrees. Much like the way the internet shaped past decades – and will continue to shape coming decades – AI will transform industries for good in the long term. Potential potholes on the path that create setbacks are only par for the course.
Looking back to gaze ahead
It is worth recalling the dot com era from the end of the last century to judge AI's current hype. The Nasdaq Composite index – a stock index that skews toward information-technology companies – peaked at more than 5,100 points in March 2000 and then rapidly declined to a final low of barely above 1,100 points in October 2002. It took until 2015 for the index to move beyond 5,000 points again.
A crash is likely in the making. But as in 2000, a bursting bubble would not mean AI goes away; internet-enabled companies and business models did not vanish then either. On the contrary, AI will flourish as the internet did. In fact, many infrastructure elements, such as datacentres, will become affordable for general use once lofty valuations come down.
During the late 1990s, the construction of fibre communication networks was perceived as a promising business opportunity. The business never became as profitable as expected, but the initial excitement created an infrastructure of dark fibre – unused but readily available communication lines – that supports today’s business models as a commodity that can be readily leveraged.
AI as an enabling technology will boost capabilities and accelerate the use of advanced digital twins. In particular, digital twins that have to work with difficult-to-capture data and not-completely-understood real-world dynamics will benefit tremendously. Digital twins of machinery can rely on a solid understanding of physics and measurable data that sensors can capture cost-effectively.
Factory environments have many known equipment dynamics and interactions – even workers’ likely movement patterns can be plugged into simulations. But urban digital twins attempt to capture the dynamics and behaviours of relevant elements across entire cities. They are subject not only to less well understood dynamics, but also to phenomena that are difficult – often impossible – to measure. AI can make the available data usable and generate estimates for phenomena that cannot be measured directly.
AI in digital twins also allows the use of scenarios to better prepare for sudden events that can affect the entire system in unexpected ways. For example, city managers might use it to develop strategies for unusual weather events, pandemic-like occurrences, or localised industrial accidents with ripple effects across the urban landscape.
Digital twins and AI to plan for tomorrow’s cities
Digital twins of urban environments are difficult to design, implement and maintain, but the commercial and societal impact they could have promises to be substantial. Because of the number of parameters, intersecting dynamics and range of conceivable scenarios, the benefits AI can provide in understanding urban environments are considerable. AI and digital twins reinforce each other.
AI can speed up the building of digital twins by supporting code development for virtual environments. Such applications accelerate overall design development and allow design details to be embedded more easily. For clients and users, AI reduces costs, enables faster implementation of digital twins, and allows for quick and inexpensive changes as requirements shift or new needs arise. In addition, AI can improve the interface between the virtual environment, its simulations of operations, and the people who use them.
Ari Lightman, a professor at Carnegie Mellon University, explains: “Generative AI would be used to look at the entire simulation and turn it into a summary for humans. It could tell me things I might be missing and summarise things in a way I can understand.”
AI doesn’t only benefit digital twins; digital twins also support AI’s capabilities. Scott Likens, emerging technology leader at PwC, points out: “We’re using digital twins to generate information for large language models…. We see an opportunity to have the digital twins generate the missing pieces of data we need, and it’s more in line with the environment because it’s based on actual data.”
Nvidia serves the smart cities market, as city planners and managers “are turning to digital twins and AI agents for urban planning scenario analysis and data-driven operational decisions”. The company provides a range of solutions that enable users to create photo-realistic, simulation-ready digital twins of urban environments to optimise city operations.
A partnership of Japanese companies is developing the digital entertainment city Namba in Osaka, Japan. The aim is to “create the world’s first smart city that integrates artificial intelligence, extended reality and decentralised physical infrastructure networks [a blockchain-based approach to managing decentralised networks] on a city-wide scale”. The group intends to offer services beyond entertainment and tourism. Namba is, however, a neighbourhood within Osaka, which limits the claim of city-wide application.
The silver lining of AI over-investment
The existence of an AI investment bubble is increasingly perceived as a foregone conclusion. AI companies and technology suppliers are now even investing in each other’s operations, adding to lofty valuations. There are obvious indications of a bubble, but positive effects can emerge from the current investment excitement. Whatever the outcome, applications for digital twins will see their timeline solidify as the immediate future of AI plays out.
If use of AI applications proves to be an all-encompassing and rapidly growing market opportunity, the immense investment of the past couple of years will be retroactively viewed as forward-looking wisdom that locked in favourable competitive positions and profits for years to come. More likely, though, investors have outrun their headlights, and expectations of adoption and diffusion of AI applications over the next few years are overrated.
If so, there will be a shock to the system, like the burst of the dot com bubble at the beginning of the century when the Nasdaq Composite Index dropped by almost 80% within 30 months. Initial warnings existed – the former chairman of the Federal Reserve used the phrase “irrational exuberance” when discussing the development of the stock market in December 1996. Warnings of an exuberant AI bubble are common today.
Bursting investment bubbles hurt investors and bring down many companies. Indeed, 25 years ago, a slew of dot com companies vanished. But the related over-investment in infrastructure can make assets suddenly affordable, opening up new opportunities. Such affordability changes cost structures, enabling business models that could not have succeeded at previous valuations. Infrastructure overhang – infrastructure built for rapid growth that does not materialise in the short run – leads to commodification of infrastructure elements, which can democratise a technology for incumbents and startups alike.
The over-investment in fibre during the dot com years ended up creating dark fibre – overbuilt fibre cables for data transmission. This infrastructure has served as a ready and inexpensive resource ever since. For AI, investment in datacentres is comparable to the fibre investment of 30 years ago. Morgan Stanley analysts forecast global datacentre spending of almost $3tn between now and 2028. The amount is staggering, and it is difficult to imagine use cases and adoption rates that will provide the required return on investment for any business model. But as initial investors see their investments shrink or vanish, new players can snap up or use the related infrastructure at bargain prices.
Alkesh Shah, a tech analyst at Bank of America, explains the underlying reason for such recurring dynamics: “You always over-estimate how fast the change will happen. And you underestimate the magnitude of the change.”
The impact digital twins will have on the marketplace will follow a similar dichotomy between today’s expectations of the rate of change and the eventual magnitude of that change. Digital twins require many technological bits and pieces to come together. And AI will play an important role in digital twins, if not tomorrow, then the day after tomorrow.