Tech

SuperMicro takes on server leaders as AMD pushes on-premise AI | Computer Weekly


Market data from analyst firm IDC shows that SuperMicro has leapfrogged established server makers Lenovo and HPE to become the second-largest server maker behind Dell.

SuperMicro posted growth of almost 134% in the fourth quarter of 2025, with revenue of $11.7bn giving it over 9% of the global server market. Dell was ahead with 10% market share and revenue of $12.6bn, while Chinese manufacturer IEIT Systems took third spot with revenue of $5.2bn and a 4% market share, ahead of Lenovo, which posted revenue of $5.1bn, and HPE ($3.9bn).

“The race for AI [artificial intelligence] adoption is settling the market pace, and with companies starving for infrastructure looking not only at GPUs [graphics processing units], but also consuming more CPUs [central processing units] among other components in order to feed their needs, we are going to see more price pressures, and that may impact on market dynamics with less units but higher average selling prices going forward,” said Juan Seminara, research director of Worldwide Enterprise Infrastructure Trackers at IDC.

IDC noted that volatile, rising prices for certain components, such as GPUs, dynamic random access memory (DRAM) and solid state drives (SSDs), have led some companies to try to lock in prices in advance while the industry adjusts to the new reality. It predicted that the impact of this price volatility could hit harder during 2026 as demand keeps outpacing capacity in the near term.

Dell aside, the established server makers seem to be losing ground. But they appear to be eyeing a new market opportunity being pushed by chipmaker AMD: the deployment of on-premise PC servers optimised to run agentic AI.

In a bid to entice IT buyers away from cloud-based AI hardware, AMD has unveiled what it sees as a new category of PC called Agent Computers. In a post on the AMD website, the company described how to run OpenClaw, the open source AI agent, locally on AMD Ryzen AI Max+ processors and Radeon GPUs using a Windows 11 PC with the Windows Subsystem for Linux (WSL).

AMD said the PC system configured with 128GB unified memory is capable of running “cloud-quality AI agent workloads efficiently” using OpenClaw. According to its own benchmark data, with the Qwen 3.5 35B A3B model, the system delivers around 45 tokens per second and processes 10,000 input tokens in about 19.5 seconds. AMD said the configuration supports a maximum context window of 260,000 tokens, and can run up to six agents concurrently, which it said means it is able to deliver scalable local AI experimentation while maintaining strong responsiveness on consumer hardware.
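The article's benchmark figures imply a prefill (prompt-processing) rate of roughly 500 tokens per second alongside the 45 tokens per second generation rate. A quick back-of-envelope calculation, using only the numbers AMD quoted (the helper functions and the 500-token reply length are illustrative assumptions, not AMD's tooling):

```python
# Rough throughput estimates derived from the benchmark figures quoted in
# the article. These helpers are illustrative, not AMD's own benchmark code.

def prefill_throughput(input_tokens: int, prefill_seconds: float) -> float:
    """Tokens ingested per second while the model processes the prompt."""
    return input_tokens / prefill_seconds

def response_latency(input_tokens: int, output_tokens: int,
                     prefill_tps: float, decode_tps: float) -> float:
    """Rough end-to-end seconds: prompt ingestion plus token generation."""
    return input_tokens / prefill_tps + output_tokens / decode_tps

# Figures reported by AMD: 10,000 input tokens in about 19.5s, ~45 tokens/s decode.
prefill_tps = prefill_throughput(10_000, 19.5)
latency = response_latency(10_000, 500, prefill_tps, 45.0)
print(f"prefill ≈ {prefill_tps:.0f} tok/s, 500-token reply in ≈ {latency:.0f}s")
```

On those assumptions, a 10,000-token prompt with a 500-token reply would take roughly half a minute end to end, which is consistent with the "strong responsiveness" claim for local agent workloads.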

AMD sees such a system running autonomously, rather like a pre-cloud-era branch office server, handling tasks sent by users through a browser user interface on another Windows PC, or via Slack or WhatsApp.

PC makers that have “agent-ready” PCs include HP, Lenovo and Asus. The IDC figures show that revenue for servers with an embedded GPU in the fourth quarter of 2025 grew 59.1% year-over-year, representing more than half of the total server market revenue.

The AMD Ryzen AI Max+ has an integrated GPU, and is currently one of the processor options for PCs certified as Copilot+ devices. While these devices are either laptops or desktop PCs with monitors, AMD’s Agent Computer appears to be positioned as more of a traditional desktop Windows PC running as a server, without a screen or keyboard. The setup AMD provides is optimised to run LM Studio, which uses Ubuntu on WSL to provide access to large language models; these then work with an OpenClaw server running locally on the same hardware.
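In a setup like the one described, the agent talks to the local model over HTTP: LM Studio exposes an OpenAI-compatible API on the same machine (by default at http://localhost:1234/v1). A minimal sketch of that client side is shown below; the endpoint, model name and prompt are illustrative assumptions, not AMD's published configuration:

```python
# Sketch of an agent calling a model served locally by LM Studio, which
# exposes an OpenAI-compatible HTTP API (default: http://localhost:1234/v1).
# Endpoint, model name and prompt are illustrative assumptions.
import json
import urllib.request

LOCAL_ENDPOINT = "http://localhost:1234/v1/chat/completions"

def build_request(model: str, user_message: str, max_tokens: int = 256) -> dict:
    """Assemble an OpenAI-style chat completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "max_tokens": max_tokens,
    }

def ask_local_model(payload: dict) -> str:
    """POST the payload to the local server and return the reply text."""
    req = urllib.request.Request(
        LOCAL_ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

payload = build_request("local-model", "Summarise today's unread messages.")
# reply = ask_local_model(payload)  # requires the local server to be running
```

Because the API shape matches the hosted OpenAI endpoints, agent frameworks written against cloud models can usually be pointed at the local server by changing only the base URL, which is what makes this kind of on-premise substitution practical.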





COBOL Is the Asbestos of Programming Languages



Early in the Covid-19 pandemic, the governor of New Jersey made an unusual admission: He’d run out of COBOL developers. The state’s unemployment insurance systems were written in the 60-year-old programming language and needed to be updated to handle hundreds of thousands of claims. Trouble was, few of the state’s employees knew how to do that. And the crisis went beyond New Jersey, just one of many states that depended on these unwieldy systems. By one rough calculation, COBOL’s inefficiencies cost the US GDP $105 billion in 2020.

You might think New Jersey would have replaced its system after this—and that Covid was COBOL’s last gasp. Not quite. The state’s new unemployment system came with a number of quality-of-life improvements, but on the backend, it was still made possible by a mainframe running the ancient language.

COBOL, short for Common Business-Oriented Language, is the most widely adopted computer language in history. Of the 300 billion lines of code that had been written by the year 2000, 80 percent of them were in COBOL. It’s still in widespread use and supports a large number of government systems, such as motor vehicle records and unemployment insurance; on any given day, it can handle something on the order of 3 trillion dollars’ worth of financial transactions. I think of COBOL as a kind of digital asbestos, almost ubiquitous once upon a time and now incredibly, dangerously difficult to remove.

COBOL was first proposed in 1959 by a committee comprising most of the US computer industry (including Grace Hopper). It called for “specifications for a common business language for automatic digital computers” to solve a growing problem: the expense of programming. Programs were custom-written for specific machines, and if you wanted to run them on something else, that meant a near-total rewrite. The committee approached the Department of Defense, which happily embraced the project.

COBOL’s design set it apart from other languages both then and now. It was meant to be written in plain English so that anybody, even nonprogrammers, would be able to use it; symbolic mathematical notation was added only after considerable debate. Most versions of COBOL allow for the use of hundreds of words (Java permits just 68), including “is,” “then,” and “to,” to make it easier to write in. Some have even said COBOL was intended to replace computer programmers, who in the 1960s occupied a rarefied place at many companies. They were masters of a technology that most people could barely comprehend. COBOL’s designers also hoped that it would generate its own documentation, saving developers time and making it easy to maintain in the long run.

But what did it even mean to be readable? Programs aren’t books or articles; they’re conditional sets of instructions. While COBOL could distill the complexity of a single line of code into something anybody could understand, that distinction fell apart in programs that ran to thousands of lines. (It’s like an Ikea assembly manual: Any given step is easy, but somehow the thing still doesn’t come together.) Moreover, COBOL was implemented with a piece of logic that grew to be despised: the GO TO statement, an unconditional branching mechanism that sent you rocketing from one section of a program to another. The result was “spaghetti code,” as developers like to say, that made self-documentation beside the point.

Plenty of computer scientists had issues with COBOL from the outset. Edsger Dijkstra famously loathed it, saying, “The use of COBOL cripples the mind; its teaching should, therefore, be regarded as a criminal offense.” Dijkstra likewise hated the GO TO statement, arguing that it made programs nearly impossible to understand. There was a degree of real snobbishness: COBOL was often looked down on as a purely utilitarian language that was intended to solve boring problems.

Jean Sammet, one of the original designers, saw it differently—the language simply had the complicated task of representing complicated things, like social security. Or as another defender wrote, “Regrettably, there are too many such business application programs written by programmers that have never had the benefit of structured COBOL taught well.” Good COBOL was indeed self-documenting, but so much depended on the specific programmer. Fred Gruenberger, a mathematician with the Rand Corporation, put it this way: “COBOL, in the hands of a master, is a beautiful tool—a very powerful tool. COBOL, as it’s going to be handled by a low-grade clerk somewhere, will be a miserable mess.”





The Tesla Influencers Leaving the ‘Cult’



She thinks some of these individuals will never stop running cover for the company because of their long-term investments. “To me it’s a lot about the money, more about the money than it is Elon—even though they say it’s Elon,” she says.

No one, however, provokes the wrath of the Tesla swarm like Dan O’Dowd.

A tech billionaire who founded Green Hills Software and serves as its CEO, he, too, was once a great proponent of Tesla vehicles and Musk’s leadership. In 2016, he owned two Roadsters and a Model S. “Big fan,” he says. That year, he was thrilled to hear Musk proclaim that a Tesla would autonomously drive itself across the US from Los Angeles to Times Square in Manhattan by the end of 2017.

“He wanted people to believe that, but there was no truth to it at all,” says O’Dowd. At that time, he still argued that Musk was a “genius.” But as the 2017 deadline went by and Musk stopped bothering to offer new time frames for the cross-country drive, O’Dowd wondered if it would ever happen. He now believes that “nothing worked at that point.”

O’Dowd also began to notice that Tesla would make splashy announcements for new products with amazing specs—like a souped-up edition of the Roadster and a line of Tesla semi trucks—that were then indefinitely delayed.

He felt Tesla was losing sight of its most important objective: a more affordable base model EV. The company scrapped plans for a long-awaited vehicle with a price target of $25,000 in 2024, and in January of this year, Musk announced that Tesla would stop producing the Model X and Model S, two flagship products, to focus on building its Optimus humanoid robots.

By 2020, people were sending O’Dowd videos demonstrating Tesla’s beta version of FSD. “I said, ‘Wait a minute, this thing is failing way too much.’ Like, this isn’t close to being done,” he says, despite Musk’s claims that it was almost perfected. O’Dowd and his team began downloading every available Tesla FSD video to analyze its malfunctions.

In 2021 he founded the Dawn Project, an organization that lobbies against the implementation of “defective and insecure software” in infrastructure and safety-critical systems. Its first and still primary campaign is aimed at shutting down FSD. The Dawn Project has warned of the dangers of the software in an ad published in The New York Times and commercials that ran during the Super Bowl broadcasts in 2023 and 2024, which showed self-driving Teslas breezing past stopped school buses and striking child-sized mannequins in pedestrian crossings.

These videos have never convinced the FSD evangelists of anything, O’Dowd says, because no matter how the tests are conceived and filmed, the Dawn Project is accused of faking everything. The Teslarati smear O’Dowd himself as a bad-faith actor. “They say ‘He’s in the pay of the oil companies, he works for Waymo, he hates Tesla,’” O’Dowd says. In response to the Times ad, Tesla faithful Omar Qazi, who goes by the handle @WholeMars on X, published a lengthy blog post that accused O’Dowd of having “blood on his hands” because Green Hills is a defense contractor. O’Dowd expected no less when he launched the Dawn Project. “I knew what had happened to the people who had called out Tesla before,” he says. Harassment and abuse come with the territory.





UK government unveils gigabit broadband upgrade tracker | Computer Weekly



As the steady pace of improvement in the UK’s national fixed broadband infrastructure continues, the UK government has launched an online address checker that allows businesses to see whether they are due to receive a gigabit broadband upgrade, giving rural communities in particular clearer visibility of roll-out plans for faster connectivity.

The launch comes as the UK government aims to accelerate broadband roll-out in harder-to-reach areas, claiming more than 750 homes and businesses are now gaining access to gigabit-capable broadband each day through the Project Gigabit scheme.

The £5bn Project Gigabit programme was introduced in 2021 with the aim of accelerating the UK’s recovery from Covid-19, boosting high-growth sectors such as tech and the creative industries, and levelling up the country by spreading wealth and creating jobs through offering access to gigabit broadband across the UK.

At its launch, the previous UK government said the scheme would prioritise areas with slow connections that would otherwise be left behind in commercial broadband companies’ plans, as well as give rural communities access to the fastest internet on the market, helping to grow the economy.

Project Gigabit specifically targets places typically regarded as too expensive for commercial providers to reach in their build and which would otherwise be left with poor digital infrastructure. It was designed from the outset to help meet the growing demand for reliable connectivity, stimulating local rural economies and reducing regional disparities by enabling remote working and attracting new businesses.  

One of the first acts by the new Labour administration that was elected in July 2024 was to reconfirm the original objectives to build a broadband infrastructure that would see 85% of the UK have gigabit-capable connectivity by the end of 2025 and full nationwide coverage by 2030.

A month later, the UK government announced that it was investing up to £800m to modernise broadband infrastructure in rural areas of England, Scotland and Wales through Project Gigabit, in a deployment contract with leading UK broadband provider Openreach.

Explaining in May 2025 why it was ramping up the broadband access scheme, the UK government said hundreds of thousands of rural homes and businesses were still struggling to fulfil basic online tasks due to outdated infrastructure, making major internet speed upgrades necessary to narrow the existing digital divide.

The upgrade plan is expected to drive productivity gains, support more than 620,000 people back into the workforce and enable more than one million to work from home, contributing an additional £19bn annually. Openreach noted in May 2025 that research by the Centre for Economics and Business Research (CEBR) shows that full-fibre broadband could deliver a £66bn boost to the UK economy by 2029.

The tracker service allows users to enter their postcode to see if their property is included in the Project Gigabit programme or in commercial fibre deployments. The government said improved connectivity will help rural communities access digital services, support remote working and boost local economic growth. Faster broadband is also expected to support sectors such as agriculture, tourism and small businesses in remote areas.

Commenting on the launch of the new service, Jennifer Holmes, CEO of the London Internet Exchange, said: “The continued roll-out of gigabit-capable broadband and improved mobile coverage is an important step in strengthening the UK’s digital infrastructure. As demand for online services continues to grow, the networks that underpin the internet must be resilient, efficient and capable of supporting increasing volumes of data.

“Strong infrastructure is essential not only for everyday connectivity, but also for supporting innovation, economic growth and the UK’s wider digital ambitions. Investment in faster and more reliable connectivity will help ensure that businesses, public services and communities can fully participate in an increasingly digital economy.”

Elizabeth Anderson, CEO of the UK’s Digital Poverty Alliance, added: “The roll-out … is a welcome step towards closing long-standing connectivity gaps across the UK. However, infrastructure alone will not solve digital poverty. Around 19 million people in the UK experience some form of digital exclusion, and government figures show that around 1.6 million people are still living entirely offline.

“We estimate around two million people lack connectivity due to affordability, and gigabit broadband is frequently out of reach due to higher costs. While faster networks are important, they only make a difference if people can afford to use them.”


