Super-sensitive sensor detects tiny hydrogen leaks in seconds for safer energy use

Zeng’s lab creates sensors that help protect the health of people and the environment. Credit: Yvonne Groner

Researchers at the University of Missouri are working to make hydrogen energy as safe as possible. As more countries and industries invest heavily in cleaner, renewable energy, hydrogen-powered factories and vehicles are gaining in popularity. But hydrogen fuel comes with risks—leaks can lead to explosions, accidents and environmental harm. Most hydrogen-detecting sensors on the market are expensive, can’t operate continuously and aren’t sensitive enough to detect tiny leaks quickly.

That’s why researcher Xiangqun Zeng and her team in the College of Engineering set out to design the ideal hydrogen sensor, focusing on six traits: sensitivity, selectivity, speed, stability, size and cost.

In a recent study published in the journal ACS Sensors, they unveiled a prototype of an affordable, longer-lasting, super-sensitive sensor that can accurately detect even the tiniest hydrogen leaks within seconds. The best part? It’s incredibly small, measuring about the size of a fingernail.

Zeng created her sensor by combining tiny crystals made of platinum and nickel with an ionic liquid. Compared with what's already on the market, the new sensor is unmatched in performance and durability.

Mizzou at the forefront of using hydrogen energy safely
Zeng’s sensors are both highly sensitive and selective. Credit: Yvonne Groner

“Hydrogen can be tricky to detect since you can’t see it, smell it or taste it,” said Zeng, a MizzouForward hire who creates sensors to protect the health of people and the environment. “In general, our goal is to create sensors that are smaller, more affordable, highly sensitive and work continuously in real time.”

While her new hydrogen sensor is still being tested in the lab, Zeng hopes to commercialize it by 2027. Mizzou is committed to furthering this impactful research, as prioritizing renewable energy will be a cornerstone of the new Energy Innovation Center, expected to open on Mizzou’s campus in 2028.

Creating improved sensors with broad applications in health care, energy and the environment has been Zeng’s mission throughout her career.

“My expertise is in developing next-generation measurement technology, and for more than 30 years, I have prioritized projects that can make the biggest impacts on society,” said Zeng, who also has an appointment in the College of Arts and Science. “If we are going to develop sensors that can detect explosive gases, it needs to be done in real time so we can help people stay as safe as possible.”

More information:
Xiaojun Liu et al, PtNi Nanocrystal–Ionic Liquid Interfaces: An Innovative Platform for High-Performance and Reliable H2 Detection, ACS Sensors (2025). DOI: 10.1021/acssensors.4c03564

Citation:
Super-sensitive sensor detects tiny hydrogen leaks in seconds for safer energy use (2025, September 3)
retrieved 3 September 2025
from https://techxplore.com/news/2025-09-super-sensitive-sensor-tiny-hydrogen.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no
part may be reproduced without the written permission. The content is provided for information purposes only.

An Amazon outage has rattled the internet. A computer scientist explains why the ‘cloud’ needs to change

Credit: Jonathan Borba from Pexels

The world’s largest cloud computing platform, Amazon Web Services (AWS), has experienced a major outage that has impacted thousands of organizations, including banks, financial software platforms such as Xero, and social media platforms such as Snapchat.

The outage began at roughly 6pm AEDT on Monday. It was caused by a malfunction at one of AWS' data centers in Northern Virginia in the United States. AWS says it has fixed the underlying issue, but some organizations are still reporting service disruptions.

This incident highlights the vulnerabilities of relying so heavily on cloud computing—or "the cloud" as it's often called. But there are ways to mitigate some of the risks.

Renting IT infrastructure

Cloud computing is the on-demand delivery of diverse IT resources such as computing power, database storage, and applications over the internet. In simple terms, it’s renting (not owning) your own IT infrastructure.

Cloud computing rose to prominence with the dot-com boom of the late 1990s, when digital tech companies began delivering software over the internet. As companies such as Amazon matured in their ability to offer what's known as "software as a service" over the web, they began renting out their virtual servers to others as well.

This was a lucrative value proposition. Cloud computing enables a pay-as-you-go model similar to a utility bill, rather than the huge upfront investment required to purchase, operate and manage your own data center.

As a result, the latest statistics suggest more than 94% of all enterprises use cloud-based services in some form.

A market dominated by three companies

The global cloud market is dominated by three companies. AWS holds the largest share (roughly 30%). It’s followed by Microsoft Azure (about 20%) and Google Cloud Platform (about 13%).

All three service providers have had recent outages, significantly impacting digital service platforms. For example, in 2024, an issue with third-party software severely impacted Microsoft Azure, causing extensive operational failures for businesses globally.

Google Cloud Platform also experienced a major outage this year due to an internal misconfiguration.

Profound risks

The heavy reliance of the global internet on just a few major providers—AWS, Azure, and Google Cloud—creates profound risks for both businesses and everyday users.

First, this concentration forms a single point of failure. As seen in the latest AWS event, a simple configuration error in one central system can trigger a domino effect that instantly paralyzes vast segments of the internet.

Second, these providers often impose vendor lock-in. Companies find it prohibitively difficult and expensive to switch platforms due to complex data architectures and excessively high fees charged for moving large volumes of data out of the cloud (data egress costs). This effectively traps customers, leaving them hostage to a single vendor’s terms.

Finally, the dominance of US-based cloud service providers introduces geopolitical and regulatory risks. Data stored in these massive systems is subject to US laws and government demands, which can complicate compliance with international data sovereignty regulations such as Australia’s Privacy Act.

Furthermore, these companies hold the power to censor or restrict access to services, giving them control over how firms operate.

The current best practice to mitigate these risks is to adopt a multi-cloud approach that enables you to decentralize. This involves running critical applications across multiple vendors to eliminate the single point of failure.

This approach can be complemented by what's known as "edge computing," wherein data storage and processing are moved away from large, central data centers toward smaller, distributed nodes (such as local servers) that firms can control directly.

The combination of edge computing and a multi-cloud approach enhances resilience, improves speed, and helps companies meet strict data regulatory requirements while avoiding dependence on any single entity.
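The failover idea behind a multi-cloud setup can be sketched in a few lines. This is a minimal illustration, not a production pattern: the provider URLs are made up, and real deployments would use each vendor's SDK, continuous health checks, and DNS-level routing.

```python
# Minimal multi-cloud failover sketch: try each provider in order and
# use the first one whose health check succeeds. The URLs are hypothetical.
PROVIDERS = [
    "https://api.primary-cloud.example.com/health",
    "https://api.backup-cloud.example.net/health",
]

def first_healthy(providers, check):
    """Return the first provider for which check(url) succeeds."""
    for url in providers:
        try:
            if check(url):
                return url
        except OSError:
            pass  # provider unreachable: fall through to the next one
    raise RuntimeError("all providers are down")

# Simulate an outage like the AWS one: the primary fails, the backup answers.
def simulated_check(url):
    if "primary" in url:
        raise OSError("regional outage")
    return True

print(first_healthy(PROVIDERS, simulated_check))
```

With the primary provider down, the function quietly falls back to the backup, which is exactly the "no single point of failure" property the multi-cloud approach aims for.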

As the old saying goes, don’t put all of your eggs in one basket.

Provided by
The Conversation


This article is republished from The Conversation under a Creative Commons license. Read the original article.

Citation:
An Amazon outage has rattled the internet. A computer scientist explains why the ‘cloud’ needs to change (2025, October 21)
retrieved 21 October 2025
from https://techxplore.com/news/2025-10-amazon-outage-rattled-internet-scientist.html



Sperm From Older Men Have More Genetic Mutations

Human sperm accumulates genetic mutations with age, and as the percentage of sperm carrying potentially serious mutations increases, so does the risk of disease in offspring.

This is according to a new study by researchers at the Sanger Institute and King’s College London. The team sequenced semen samples from individuals between the ages of 24 and 75, using very high-precision technologies, and found that the male germ line (the line of cells that produce sperm) is subject to a combination of mutation and positive selection.

The scientists used a duplex sequencing technique called NanoSeq, which can detect rare mutations with a very low margin of error. This allowed them to analyze 81 sperm samples from 57 donors. The results showed that a man’s sperm adds an average of 1.67 new mutations every year.

But the most striking aspect of the study is not limited to the mere accumulation of mutations with age. The authors discovered that the male germ line is subject to positive selection. That is, certain mutations offer an advantage to cells that produce sperm and expand. They identified that many of these mutations are in genes related to developmental disorders or a predisposition to childhood cancer.

“We expected to find evidence that selection influences mutations in sperm,” said Matthew Neville, coauthor of the study published this month in the journal Nature. “What surprised us was how much the number of sperm carrying mutations associated with serious diseases increases.”

What Does This Mean for Children of Older Fathers?

The researchers estimated that about 3 to 5 percent of sperm from middle-aged and older men carry some potentially pathogenic mutation in the exome (the coding part of the genome). That represents a higher risk than previous estimates. In more concrete numbers, the estimated fraction for men in their thirties was close to 2 percent, while it reached about 4.5 percent for men in their seventies.
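A back-of-the-envelope sketch ties these figures together. The code below simply applies the reported rate of 1.67 new mutations per year and linearly interpolates the pathogenic fraction between the two reported anchor points (about 2 percent in the thirties, about 4.5 percent in the seventies); the study itself finds the real curve is not purely linear, because positive selection amplifies some mutations.

```python
MUTATIONS_PER_YEAR = 1.67  # average new mutations per sperm genome per year

def extra_mutations(age_older, age_younger):
    """Expected additional mutations accumulated between two ages."""
    return (age_older - age_younger) * MUTATIONS_PER_YEAR

def pathogenic_fraction(age, lo=(35, 0.02), hi=(75, 0.045)):
    """Linear interpolation between the two reported anchor points."""
    (a0, f0), (a1, f1) = lo, hi
    return f0 + (age - a0) * (f1 - f0) / (a1 - a0)

print(extra_mutations(75, 35))            # 40 years at 1.67 per year
print(round(pathogenic_fraction(55), 4))  # midpoint estimate, fifties
```

Under these assumptions a 75-year-old's sperm carries roughly 67 more mutations than a 35-year-old's, and the study's point is that selection makes the disease-relevant subset of those grow faster than this simple model suggests.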

From the evolutionary and clinical perspective, the implications are significant. Evolutionarily, it shows that the male germ line is not simply a “machine” that accumulates errors: There is a dynamic process of mutation and selection that can modify the genetic “quality” of the sperm with the age of the father.

On the clinical side, it raises questions about reproductive planning, genetic counseling, and the additional risks associated with older fatherhood. The authors argue that although the percentages remain modest, the accumulation is not merely linear: there is also a selection component that favors mutations with the potential to spread.

Forget SEO. Welcome to the World of Generative Engine Optimization

This holiday season, rather than searching on Google, more Americans will likely be turning to large language models to find gifts, deals, and sales. Retailers could see up to a 520 percent increase in traffic from chatbots and AI search engines this year compared to 2024, according to a recent shopping report from Adobe. OpenAI is already moving to capitalize on the trend: Last week, the ChatGPT maker announced a major partnership with Walmart that will allow users to buy goods directly within the chat window.

As people start relying on chatbots to discover new products, retailers are having to rethink their approach to online marketing. For decades, companies tried to game Google’s search results by using strategies known collectively as search engine optimization, or SEO. Now, in order to get noticed by AI bots, more brands are turning to “generative engine optimization,” or GEO. The cottage industry is expected to be worth nearly $850 million this year, according to one market research estimate.

GEO, in many ways, is less a new invention than the next phase of SEO. Many GEO consultants, in fact, came from the world of SEO. At least some of their old strategies likely still apply since the core goal remains the same: anticipate the questions people will ask and make sure your content appears in the answers. But there’s also growing evidence that chatbots are surfacing different kinds of information than search engines.

Imri Marcus, chief executive of the GEO firm Brandlight, estimates that there used to be about a 70 percent overlap between the top Google links and the sources cited by AI tools. Now, he says, that correlation has fallen below 20 percent.

Search engines often favor wordiness—think of the long blog posts that appear above recipes on cooking websites. But Marcus says that chatbots tend to favor information presented in simple, structured formats, like bulleted lists and FAQ pages. “An FAQ can answer a hundred different questions instead of one article that just says how great your entire brand is,” he says. “You essentially give a hundred different options for the AI engines to choose.”
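As an illustration of the kind of structured format Marcus describes, the sketch below emits Q&A content as schema.org FAQPage markup (JSON-LD), one widely used way to publish machine-readable FAQs. The article itself does not prescribe this specific format, and the example questions are invented.

```python
import json

def faq_jsonld(pairs):
    """Build schema.org FAQPage markup from (question, answer) pairs."""
    return {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in pairs
        ],
    }

markup = faq_jsonld([
    ("What is the driving range of the example EV?",
     "Up to about 500 km on a full charge."),
    ("Does it support fast charging?",
     "Yes, 10 to 80 percent in roughly 30 minutes."),
])
print(json.dumps(markup, indent=2))
```

Each question-answer pair becomes a separate, self-contained entry, which matches Marcus' point: a hundred granular answers give an AI engine a hundred options to cite, where a single brand-level article gives it one.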

The things people ask chatbots are often highly specific, so it’s helpful for companies to publish extremely granular information. “No one goes to ChatGPT and asks, ‘Is General Motors a good company?’” says Marcus. Instead, they ask if the Chevy Silverado or the Chevy Blazer has a longer driving range. “Writing more specific content actually will drive much better results because the questions are way more specific.”
