Tech
Interrupting encoder training in diffusion models enables more efficient generative AI
Researchers at Science Tokyo have developed a new framework for generative diffusion models. The method reinterprets Schrödinger bridge models as variational autoencoders with infinitely many latent variables, reducing computational cost and preventing overfitting. By interrupting the training of the encoder at the right moment, the approach enables more efficient generative AI, with broad applicability beyond standard diffusion models.
Diffusion models are among the most widely used approaches in generative AI for creating images and audio. These models generate new data by gradually adding noise (noising) to real samples and then learning how to reverse that process (denoising) back into realistic data. A widely used version, the score-based model, achieves this via a diffusion process that connects the prior distribution to the data over a sufficiently long time interval. This method has a limitation, however: when the data differs strongly from the prior, the noising and denoising processes require longer time intervals, which slows down sample generation.
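The noising half of this pipeline is easy to sketch. The toy below is illustrative only (the step count and per-step noise rate `beta` are made-up values, not taken from the study): it repeatedly blends 1-D "data" with Gaussian noise until the samples become indistinguishable from a standard normal prior.

```python
import numpy as np

rng = np.random.default_rng(1)

# "Real data": a narrow distribution centered far from the N(0, 1) prior.
x = rng.normal(loc=3.0, scale=0.2, size=10_000)

beta = 0.05  # per-step noise rate (illustrative value)
for _ in range(200):
    # Each step shrinks the signal slightly and injects fresh Gaussian noise;
    # the variance-preserving mix keeps the stationary distribution at N(0, 1).
    x = np.sqrt(1 - beta) * x + np.sqrt(beta) * rng.normal(size=x.size)

# After enough steps, mean ~ 0 and variance ~ 1: the samples match the prior.
print(round(x.mean(), 2), round(x.var(), 2))
```

Note that the farther the data sits from the prior (here, a mean of 3.0), the more steps this blending needs before the samples match it, which is exactly the slowdown described above.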
Now, a research team from the Institute of Science Tokyo (Science Tokyo), Japan, has proposed a new framework for diffusion models that is faster and computationally less demanding. They achieved this by reinterpreting Schrödinger bridge (SB) models, a type of diffusion model, as variational autoencoders (VAEs).
The study was led by graduate student Mr. Kentaro Kaba and Professor Masayuki Ohzeki from the Department of Physics at Science Tokyo, in collaboration with Mr. Reo Shimizu (then a graduate student) and Associate Professor Yuki Sugiyama from the Graduate School of Information Sciences at Tohoku University, Japan. Their findings were published in Physical Review Research on September 3, 2025.
SB models offer greater flexibility than standard score-based models because they can connect any two probability distributions over a finite time using a stochastic differential equation (SDE). This supports more complex noising processes and higher-quality sample generation. The trade-off, however, is that SB models are mathematically complex and expensive to train.
The proposed method addresses this by reformulating SB models as VAEs with multiple latent variables. “The key insight lies in extending the number of latent variables from one to infinity, leveraging the data-processing inequality. This perspective enables us to interpret SB-type models within the framework of VAEs,” says Kaba.
In this setup, the encoder represents the forward process that maps real data onto a noisy latent space, while the decoder reverses the process to reconstruct realistic samples, and both processes are modeled as SDEs learned by neural networks.
The model employs a training objective with two components. The first is the prior loss, which ensures that the encoder correctly maps the data distribution to the prior distribution. The second is drift matching, which trains the decoder to mimic the dynamics of the reverse encoder process. Moreover, once the prior loss stabilizes, encoder training can be stopped early. This allows training to finish faster, reducing the risk of overfitting while preserving the high accuracy of SB models.
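As a rough illustration of the early-interruption idea (not the paper's actual algorithm: the linear encoder drift, the moment-matching surrogate for the prior loss, and the stopping threshold below are all simplified assumptions), one can monitor a prior loss while strengthening a toy encoder and stop as soon as the loss stabilizes, after which only the decoder's drift-matching term would continue training:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_forward(x0, a, n_steps=100, T=1.0):
    """Euler-Maruyama for a toy encoder SDE dX = -a*X dt + sqrt(2a) dW,
    whose stationary distribution is the N(0, 1) prior."""
    dt = T / n_steps
    x = x0.copy()
    for _ in range(n_steps):
        x += -a * x * dt + np.sqrt(2 * a * dt) * rng.normal(size=x.shape)
    return x

def prior_loss(x_T):
    """Moment-matching surrogate for KL(terminal distribution || N(0, 1))."""
    m, v = x_T.mean(), x_T.var()
    return 0.5 * (m**2 + v - np.log(v) - 1.0)

def drift_matching_loss(decoder_drift, target_drift, x, t):
    """MSE between the decoder's drift and the reverse-time encoder drift."""
    return float(np.mean((decoder_drift(x, t) - target_drift(x, t)) ** 2))

# "Data" far from the prior; a crude sweep over encoder strengths stands in
# for gradient-based training of the encoder drift network.
x0 = rng.normal(loc=5.0, scale=0.5, size=5000)
losses = []
for a in [0.5, 1.0, 2.0, 4.0, 8.0]:
    losses.append(prior_loss(simulate_forward(x0, a)))
    if len(losses) > 1 and abs(losses[-1] - losses[-2]) < 1e-2:
        break  # prior loss has stabilized: interrupt encoder training

# With the encoder frozen, drift_matching_loss alone would keep training
# the decoder from this point on.
print([round(l, 3) for l in losses])
```

The prior loss drops steeply while the encoder is still far from mapping the data onto the prior, then flattens; continuing to train past that point mostly buys overfitting, which is what the interruption avoids.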
“The objective function is composed of the prior loss and drift matching parts, which characterize the training of the neural networks in the encoder and the decoder, respectively. Together, they reduce the computational cost of training SB-type models. It was demonstrated that interrupting the training of the encoder mitigated the challenge of overfitting,” explains Ohzeki.
This approach is flexible and can be applied to other probabilistic rule sets, even non-Markov processes, making it a broadly applicable training scheme.
More information:
Kentaro Kaba et al, Schrödinger bridge-type diffusion models as an extension of variational autoencoders, Physical Review Research (2025). DOI: 10.1103/dxp7-4hby
Citation:
Interrupting encoder training in diffusion models enables more efficient generative AI (2025, September 29)
retrieved 29 September 2025
from https://techxplore.com/news/2025-09-encoder-diffusion-enables-efficient-generative.html
Oh No! A Free Scale That Tells Me My Stress Levels and Body Fat
I will admit to being afraid of scales—the kind that weigh you, not the ones on a snake. And so my first reaction to the idea I’d be getting a free body-scanning scale with a Factor prepared meal kit subscription was something akin to “Oh no!”
It’s always bad or shameful news, I figured, and maybe nothing I don’t already know. Though, as it turned out, I was wrong on both points.
Factor is, of course, the prepared meal brand from meal kit giant HelloFresh, which I’ve tested while reviewing dozens of meal kits this past year. Think delivery TV dinners, but actually fresh and never frozen. Factor meals are meant to be microwaved, but I found when I reviewed Factor last year that the meals actually taste much better if you air-fry them (ideally using a Ninja Crispi, the best reheating device I know).
Factor especially excels at the low-carb, protein-rich diet that has become equally fashionable among people who want to lose weight and people who like to lift it. Hence, this scale. Factor would like you to be able to track your progress in gaining muscle mass, losing fat, or both. And then, presumably, keep using Factor to meet your fitness or wellness goals.
While your first week of Factor comes at a discount right now, regular-price meals will be $14 to $15 a serving, plus $11 shipping per box. That’s less than most restaurant delivery, but certainly more than if you were whipping up these meals yourself.
If you subscribe between now and the end of March, the third Factor meal box will come with a free Withings Body Comp scale, which generally retails north of $200. The Withings doesn’t just weigh you. It scans your proportions of fat and bone and muscle, and indirectly measures stress levels and the elasticity of your blood vessels. It is, in fact, WIRED’s favorite smart scale, something like a fitness watch for your feet.
Anyway, to get the deal, use the code CONWITHINGS on Factor’s website, or follow the promo code link below.
Is It My Body?
The scale that comes with the Factor subscription is about as fancy as it gets: a $200 Body Comp scale from high-tech fitness monitoring company Withings. The scale uses bioelectrical impedance analysis and some other proprietary methods to measure not just your weight but your body fat percentage, lean muscle mass, visceral fat, bone and water mass, and pulse rate, and even the stiffness of your arteries.
To get all this information, all you really need to do is stand on the scale for a few minutes. The scale will recognize you based on your weight (you’ll need to be accurate in describing yourself when you set up your profile for this to work), and then cycle through a series of measurements before giving you a cheery weather report for the day.
Your electrodermal activity—the “skin response via sweat gland stimulation in your feet”—provides a gauge of stress, or at least excitation. The Withings also purports to measure your arterial age, or stiffness, via the velocity of your blood with each heartbeat. This sounds esoteric, but it has some scientific backing.
Note that many physicians caution against taking indirect measurements of body composition as gospel. Other physicians counter that previous “gold standard” measurements aren’t perfectly accurate, either. It’s a big ol’ debate. For myself, I tend to take smart-scale measurements as a convenient way to track progress, and also a good home indicator for when there’s a problem that may require attention from a physician.
And so of course, I was petrified. So much bad news to get all at once! I figured.
Discovering the Dimensions of a New Cold War
In 2025, American and world leaders were preoccupied with wars in the Middle East. Most dramatically, Israel and then the United States bombed Iran’s nuclear facilities. Some commentators feared that President Trump’s decision to bomb Iran would drag the United States into the “forever wars” in the Middle East that presidential candidate Trump had pledged to avoid. The tragic war in Gaza had become a humanitarian disaster. After years of promises from Democratic and Republican presidents alike to reduce engagement with the region, it appeared that the US was being dragged back into the Middle East once again.
I hope that’s not the case. Instead, in 2026, President Trump, his administration, the US Congress, and the American people more generally must realize that the real challenges to American national interests, the free world, and global order come not from the Middle East but from autocratic China and Russia. The three-decade honeymoon from great power politics after the collapse of the Soviet Union and the end of the Cold War is over. For the United States to succeed in this new era of great power competition, US strategists must first accurately diagnose the threat and then devise and implement effective prescriptions.
The oversimplified assessment is that we have entered a new Cold War with Xi’s China and his sidekick, Russian leader Vladimir Putin. To be sure, there are some parallels between our current era of great power competition and the Cold War. The balance of power in the world today is dominated by two great powers, the United States and China, much like the United States and the Soviet Union dominated the world during the Cold War. Second, like the contest between communism and capitalism during the last century, there is an ideological conflict between the great powers today. The United States is a democracy. China and Russia are autocracies. Third, at least until the second Trump era, all three of these great powers have sought to propagate and expand their influence globally. That too was the case during the last Cold War.
At the same time, there are also some significant differences. Superimposing the Cold War metaphor on the US-China rivalry today distorts as much as it illuminates.
First, while the world is dominated by two great powers, the United States remains more powerful than China on many dimensions of power—military, economic, ideological—and especially so when allies are added to the equation. Also different from the Cold War, several mid-level powers have emerged in the global system—Brazil, India, Indonesia, Saudi Arabia, and South Africa, among others—that are unwilling to align exclusively with either the American bloc or the Chinese bloc.
Second, while the ideological dimension of great power competition is real, it is not as intense as the Cold War. The Soviets aimed to spread communism worldwide, including in Europe and the United States. They were willing to deploy the Red Army, provide military and economic assistance, overthrow regimes, and fight proxy wars with the United States to achieve that aim. So far, Xi Jinping and the Communist Party of China have not employed these same aggressive methods to export their model of governance or construct an alternative world order. Putin is much more aggressive in propagating his ideology of illiberal nationalism and seeking to destroy the liberal international order. Thankfully, however, Russia does not have the capabilities of China to succeed in these revisionist aims.
Walmart Promo Codes for December 2025
After living in big cities like San Francisco and New York, when I set foot in Wally World in the Midwest, I heard angels sing. Rows and rows of fluorescent lights highlighted any and every product needed for your house in one place. Screw the mom-and-pop bodega—I missed this level of convenience. If by chance they don’t have what you need in-store, there’s even more online, with pickup and delivery available.
Save $10 With Our Limited-Time Walmart Promo Code
Skip the line at your local Walmart and take $10 off each of your first three delivery or pickup orders of $50 or more with our Walmart coupon code, TRIPLE10. So, whether you’re stocking up on late-night munchies or some toiletries for your next getaway, you can take $10 off your next purchase from now until the end of the year.
No Walmart Coupon? No Problem.
Walmart has quite literally thousands of flash deals that change weekly, with up to 65% off tech, appliances, end-of-season, and holiday items, so be sure to check often to find the best rotating deals. And if you’re like me, you’re always searching for the best tech deals without breaking the bank. So whether you’re looking to purchase a new 17-piece non-stick cookware set, a Dyson cordless vacuum cleaner, or this season’s latest clothing trends for men, women, or children—Walmart is your one-stop shop for it all.
You can also enjoy great benefits with Walmart+, a paid membership that gives early access to promotions and events like Walmart Black Friday deals, free delivery, free shipping with no order minimum, savings on fuel, streaming with Paramount+, and more. You can pay monthly or annually, and you’ll get a free trial of Walmart+ for 30 days to try it out. Walmart+ Assist helps qualifying government aid recipients get a membership at a lower cost.