Microsoft users warned over privilege elevation flaw | Computer Weekly



Microsoft marked the penultimate Patch Tuesday of 2025 with a lighter update than of late, addressing a mere 63 common vulnerabilities and exposures (CVEs) across its product estate – a far cry from many of its recent drops, which have averaged well over 100 – and a solitary zero-day flaw.

Tracked as CVE-2025-62215, this month’s single zero-day is an elevation of privilege (EoP) vulnerability in the Windows Kernel, which sits at the core of Microsoft’s operating system. It carries a CVSS score of just 7.0 and is not rated critical in severity; however, exploitation has been observed in the wild, although no public proof-of-concept has yet been released.

Ben McCarthy, lead cyber security engineer at Immersive, explained that the root cause of the issue stems from two combined weaknesses: a race condition, in which more than one process tries to access shared data and change it concurrently, and a double-free memory management error.

“An attacker with low-privilege local access can run a specially crafted application that repeatedly attempts to trigger this race condition,” he explained. “The goal is to get multiple threads to interact with a shared kernel resource in an unsynchronised way, confusing the kernel’s memory management and causing it to free the same memory block twice.

“This successful double-free corrupts the kernel heap, allowing the attacker to overwrite memory and hijack the system’s execution flow.”
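The user-mode C++ sketch below is a minimal illustration of the bug class McCarthy describes – an unsynchronised check-and-free that lets two threads release the same allocation. The structure, names and user-mode setting are assumptions made for the sake of the example; this is not the Windows Kernel code behind CVE-2025-62215.

```cpp
// Illustrative only: a user-mode analogue of the bug class described above,
// not the actual Windows Kernel code behind CVE-2025-62215.
#include <cstdlib>
#include <functional>
#include <thread>

struct SharedResource {
    void* buffer;
    bool  in_use;   // unsynchronised flag: the seed of the race condition
};

// If two threads call this concurrently, both can observe in_use == true
// before either clears it, so both take the free() path: a double free that
// corrupts the heap, which an attacker can then groom to overwrite adjacent
// allocations and hijack execution flow.
void release(SharedResource& res) {
    if (res.in_use) {
        std::free(res.buffer);
        res.in_use = false;
    }
}

int main() {
    SharedResource res{std::malloc(64), true};
    std::thread t1(release, std::ref(res));
    std::thread t2(release, std::ref(res));
    t1.join();
    t2.join();
    // The fix is to make the check-and-free atomic: guard it with a mutex,
    // or give the buffer a single owner (e.g. std::unique_ptr) so only one
    // code path can ever release it.
    return 0;
}
```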

McCarthy added: “Organisations must prioritise applying the patch for this vulnerability. While a 7.0 CVSS score might not always top a patch list, the active exploitation status makes it a critical priority. A successful exploit grants the attacker System privileges, allowing them to completely bypass endpoint security, steal credentials, install rootkits, and perform other malicious actions. This is a critical link in an attacker’s post-exploitation playbook.”

In the real world, said Mike Walters, president and co-founder of Action1, there are three core business impacts that would potentially arise from a successful compromise via CVE-2025-62215. Walters highlighted the possibility of mass credential exposure arising from the compromise of critical file servers, lateral movement and ransomware deployment, and regulatory, financial and reputational harm from data leakage or other operational disruption.

“Exploitation is complex,” he noted, “but a functional exploit seen in the wild raises urgency, since skilled actors can reliably weaponise this in targeted campaigns.”

Also high on the agenda for November is CVE-2025-60724, a remote code execution (RCE) vulnerability in Graphics Device Interface Plus (GDI+), which carries a CVSS score of 9.8. GDI+ is a relatively low-level component, but it is responsible for rendering 2D graphics, images and text, and therefore provides core functionality to multiple Microsoft applications – and countless third-party programs, too.

Adam Barnett, Rapid7 lead software engineer, said this was as close to a zero-day as it was possible to get and likely to affect just about every asset running Microsoft software.

“In the worst-case scenario, an attacker could exploit this vulnerability by uploading a malicious document to a vulnerable web service,” he said.

“The advisory doesn’t spell out the context of code execution, but if all the stars align for the attacker, the prize could be remote code execution as System via the network without any need for an existing foothold. While this vuln almost certainly isn’t wormable, it’s clearly very serious and is surely a top priority for just about anyone considering how to approach this month’s patches.”
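As a concrete illustration of that exposure, the hypothetical C++ handler below does what many upload services do: it hands attacker-controlled bytes straight to GDI+ for parsing. The function name and surrounding structure are assumptions for the sake of the example; the point is that any code path letting untrusted files reach GDI+ rendering inherits the risk until the patch is applied.

```cpp
// Hypothetical upload handler, for illustration only: untrusted bytes from a
// client reach GDI+ parsing directly, which is the exposure described above.
#include <windows.h>
#include <gdiplus.h>
#include <shlwapi.h>   // SHCreateMemStream
#pragma comment(lib, "gdiplus.lib")
#pragma comment(lib, "shlwapi.lib")

void HandleUploadedImage(const BYTE* data, UINT length) {
    Gdiplus::GdiplusStartupInput startupInput;
    ULONG_PTR token = 0;
    if (Gdiplus::GdiplusStartup(&token, &startupInput, nullptr) != Gdiplus::Ok)
        return;

    // Wrap the raw upload in an IStream and let GDI+ parse it. On an
    // unpatched system, a malformed file arriving here is the scenario
    // Barnett outlines; the only reliable mitigation is applying the patch.
    IStream* stream = SHCreateMemStream(data, length);
    if (stream != nullptr) {
        Gdiplus::Image image(stream);
        if (image.GetLastStatus() == Gdiplus::Ok) {
            // ... thumbnail generation, dimension checks, format conversion ...
        }
        stream->Release();
    }

    Gdiplus::GdiplusShutdown(token);
}
```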

Action1’s Walters added: “This is emergency-level: a network-reachable RCE with no user interaction and low attack complexity is among the most dangerous bugs. Server compromise, tenant impact in multi-tenant systems, and the potential for rapid mass exploitation make this a top priority. 

“Exploitation may take time to perfect because attackers must build reliable allocator and interpreter manipulations that bypass mitigations like CFG, ASLR, and DEP. Still, GDI+ and image parsing bugs have a history of being weaponised quickly.”

Critically acclaimed bugs

Finally, the docket for security teams this month includes four critical vulnerabilities, highlighted by Dustin Childs of Trend Micro’s Zero Day Initiative (ZDI). These are CVE-2025-30398, a third-party information disclosure flaw in Nuance PowerScribe 360; CVE-2025-60716, an EoP flaw in DirectX Graphics Kernel; CVE-2025-62199, an RCE flaw in Microsoft Office; and CVE-2025-62214, another RCE flaw in Visual Studio.




Our Favorite Travel and Outdoor Gear Is on Sale at Huckberry



Huckberry, purveyor of finely curated clothing and gear for the sort of person equally at home in the woods and the city, is having one of its rare site-wide sales this week—or pretty close to site-wide. We’ve tested and love quite a bit of Huckberry’s stuff, especially the Proof 72-hour merino T-shirt. If you buy nothing else this year, buy that. Trust me. Check out the other deals, which we’ve rounded up below.

Great Deals on our Favorite Travel Clothes


Proof

72-Hour Merino T-Shirt

Huckberry’s Proof 72-hour merino T-shirt is our favorite merino wool T-shirt. The cut and style are not overly sporty, making it more versatile than some others, from everyday wear around town to a trip to the gym. Mine is still soft even after six months of wear and washing. At 87 percent 150-gsm superfine merino wool (16.5 micron) and 13 percent nylon, this T-shirt makes a great starter for those new to merino wool—there’s enough nylon that it’s stretchy, and it’s not the least bit itchy.

These pants are the companion piece to the 72-hour shirt above. There’s quite a bit less wool here, though. The breakdown is 47 percent merino wool, 33 percent nylon, 14 percent polyester, and 6 percent elastane. The result is a much stretchier fabric than the T-shirt, one that still provides a good amount of moisture-wicking and the anti-odor properties of merino. My only gripe with these is that they feel synthetic. What I love about them is the stain resistance. Yes, the DWR coating that gives them that stain resistance will wear off, but it’s not too hard to rejuvenate.

When I travel, these are the pants I wear. They’re light, comfy, stretchy, and weigh next to nothing. They’re 98 percent cotton, with 2 percent Spandex to give them a little stretch. Unlike jeans, these have enough flex that you can easily do squats in them. It’s possible that translates to some stretching out over time, but I’ve been wearing mine for going on a year now and they still fit perfectly.

I love this jacket. It’s the only jacket I’ve ever worn that anyone has complimented me on, which is also the case for another WIRED staffer. Waxed canvas is definitely heavy, but it stands up very well to wear. I’ve had my Trucker Jacket for well over a year and it still looks like new. I don’t need to re-wax it yet, but I have re-waxed other things and it’s dead simple to do. There’s also a wool-lined version, which I have not tried but I do kinda wish I had that instead of the flannel. It’s on sale as well.

Deals on Backpacks, Coffee Brewers, and Other Gear


GoRucks are awesome backpacks, but they aren’t cheap. Here’s a chance to get the GoRuck GR1 for a bit less. This is a collaboration between GoRuck and Huckberry, with branding from both companies on the pack. My favorite thing about the GR1 is its versatility. I have used this pack for plane travel (as a carry-on), rucking, hiking, hauling camera gear, and more. I even strapped it to the back rack of my bike for an overnight bikepacking trip. If you want to ruck with it, grab a weight plate as well.

The Yeti Hopper Flip 12 is a nice little personal-size cooler. It closes with a waterproof zipper, which has never leaked on me thus far. With 12 quarts of capacity, it’s not huge—think a six-pack and a sandwich, depending on what you use to keep things cold (ice packs are the way to go with this one).

This isn’t a huge discount, but any time you can save some money on Snow Peak it’s a win. The company’s incredibly well-designed gear isn’t cheap. Take this mug, which amounts to a $47 coffee mug. But look, it’s titanium, OK? And it’s double-walled so your coffee stays warm even on those bitter cold mornings at the cabin. (Ed. note: These are editor Adrienne So’s camping mugs and she’s used them for about 10 years now.)

If you’re going to get the mug, you might as well get the French press too.

You see where we’re going here—mug, brewer, and now grinder. Yes, this is a $140 (on sale!) military-grade aluminum and high-carbon stainless steel burr grinder, which, I know, is a lot. It is also hands down the best, most reliable hand grinder I’ve ever used. Mine is five years old and has stood up to the abuse of years and years of travel without missing a beat. It’s missing a little paint, but otherwise works exactly like the day I got it. On sale, I might add.

Peak Design Everyday bag


The Everyday Backpack is one of our favorite camera bags, but it doesn’t have to be that. It’s really just a nice EDC backpack with some well-thought-out features, like a tuck-away waist strap, three FlexFold dividers, and a nice strap for attaching it to the handle of your rolling carry-on bag.






Anthropic’s Claude Takes Control of a Robot Dog



As more robots start showing up in warehouses, offices, and even people’s homes, the idea of large language models hacking into complex systems sounds like the stuff of sci-fi nightmares. So, naturally, Anthropic researchers were eager to see what would happen if Claude tried taking control of a robot—in this case, a robot dog.

In a new study, Anthropic researchers found that Claude was able to automate much of the work involved in programming a robot and getting it to do physical tasks. On one level, their findings show the agentic coding abilities of modern AI models. On another, they hint at how these systems may start to extend into the physical realm as models master more aspects of coding and get better at interacting with software—and physical objects as well.

“We have the suspicion that the next step for AI models is to start reaching out into the world and affecting the world more broadly,” Logan Graham, a member of Anthropic’s red team, which studies models for potential risks, tells WIRED. “This will really require models to interface more with robots.”


Anthropic was founded in 2021 by former OpenAI staffers who believed that AI might become problematic—even dangerous—as it advances. Today’s models are not smart enough to take full control of a robot, Graham says, but future models might be. He says that studying how people leverage LLMs to program robots could help the industry prepare for the idea of “models eventually self-embodying,” referring to the idea that AI may someday operate physical systems.

It is still unclear why an AI model would decide to take control of a robot—let alone do something malevolent with it. But speculating about the worst-case scenario is part of Anthropic’s brand, and it helps position the company as a key player in the responsible AI movement.

In the experiment, dubbed Project Fetch, Anthropic asked two groups of researchers without previous robotics experience to take control of a robot dog, the Unitree Go2 quadruped, and program it to do specific activities. The teams were given access to a controller, then asked to complete increasingly complex tasks. One group was using Claude’s coding model—the other was writing code without AI assistance. The group using Claude was able to complete some—though not all—tasks faster than the human-only programming group. For example, it was able to get the robot to walk around and find a beach ball, something that the human-only group could not figure out.

Anthropic also studied the collaboration dynamics in both teams by recording and analyzing their interactions. They found that the group without access to Claude exhibited more negative sentiments and confusion. This might be because Claude made it quicker to connect to the robot and coded an easier-to-use interface.


The Go2 robot used in Anthropic’s experiments costs $16,900—relatively cheap, by robot standards. It is typically deployed in industries like construction and manufacturing to perform remote inspections and security patrols. The robot is able to walk autonomously but generally relies on high-level software commands or a person operating a controller. Go2 is made by Unitree, which is based in Hangzhou, China. Its AI systems are currently the most popular on the market, according to a recent report by SemiAnalysis.

The large language models that power ChatGPT and other clever chatbots typically generate text or images in response to a prompt. More recently, these systems have become adept at generating code and operating software—turning them into agents rather than just text-generators.




The AI Boom Is Fueling a Need for Speed in Chip Networking



The new era of Silicon Valley runs on networking—and not the kind you find on LinkedIn.

As the tech industry funnels billions into AI data centers, chip makers both big and small are ramping up innovation around the technology that connects chips to other chips, and server racks to other server racks.

Networking technology has been around since the dawn of the computer, critically connecting mainframes so they can share data. In the world of semiconductors, networking plays a part at almost every level of the stack—from the interconnect between transistors on the chip itself, to the external connections made between boxes or racks of chips.

Chip giants like Nvidia, Broadcom, and Marvell already have well-established networking bona fides. But in the AI boom, some companies are seeking new networking approaches that help them speed up the massive amounts of digital information flowing through data centers. This is where deep-tech startups like Lightmatter, Celestial AI, and PsiQuantum, which use optical technology to accelerate high-speed computing, come in.

Optical technology, or photonics, is having a coming-of-age moment. The technology was considered “lame, expensive, and marginally useful” for 25 years, until the AI boom reignited interest in it, according to PsiQuantum cofounder and chief scientific officer Pete Shadbolt. (Shadbolt appeared on a panel last week that WIRED cohosted.)

Some venture capitalists and institutional investors, hoping to catch the next wave of chip innovation or at least find a suitable acquisition target, are funneling billions into startups like these that have found new ways to speed up data throughput. They believe that traditional interconnect technology, which relies on electrons, simply can’t keep pace with the growing need for high-bandwidth AI workloads.

“If you look back historically, networking was really boring to cover, because it was switching packets of bits,” says Ben Bajarin, a longtime tech analyst who serves as CEO of the research firm Creative Strategies. “Now, because of AI, it’s having to move fairly robust workloads, and that’s why you’re seeing innovation around speed.”

Big Chip Energy

Bajarin and others give credit to Nvidia for being prescient about the importance of networking when it made two key acquisitions in the technology years ago. In 2020, Nvidia spent nearly $7 billion to acquire the Israeli firm Mellanox Technologies, which makes high-speed networking solutions for servers and data centers. Shortly after, Nvidia purchased Cumulus Networks, to power its Linux-based software system for computer networking. This was a turning point for Nvidia, which rightly wagered that the GPU and its parallel-computing capabilities would become much more powerful when clustered with other GPUs and put in data centers.

While Nvidia dominates in vertically-integrated GPU stacks, Broadcom has become a key player in custom chip accelerators and high-speed networking technology. The $1.7 trillion company works closely with Google, Meta, and more recently, OpenAI, on chips for data centers. It’s also at the forefront of silicon photonics. And last month, Reuters reported that Broadcom is readying a new networking chip called Thor Ultra, designed to provide a “critical link between an AI system and the rest of the data center.”

On its earnings call last week, semiconductor design giant ARM announced plans to acquire the networking company DreamBig for $265 million. DreamBig makes AI chiplets—small, modular circuits designed to be packaged together in larger chip systems—in partnership with Samsung. The startup has “interesting intellectual property … which [is] very key for scale-up and scale-out networking,” said ARM CEO Rene Haas on the earnings call. (This means connecting components and sending data up and down a single chip cluster, as well as connecting racks of chips with other racks.)

Light On

Lightmatter CEO Nick Harris has pointed out that the amount of computing power that AI requires now doubles every three months—much faster than Moore’s Law dictates. Computer chips are getting bigger and bigger. “Whenever you’re at the state of the art of the biggest chips you can build, all performance after that comes from linking the chips together,” Harris says.

His company’s approach is cutting-edge and doesn’t rely on traditional networking technology. Lightmatter builds silicon photonics that link chips together. It claims to make the world’s fastest photonic engine for AI chips, essentially a 3D stack of silicon connected by light-based interconnect technology. The startup has raised more than $500 million over the past two years from investors like GV and T. Rowe Price. Last year, its valuation reached $4.4 billion.


