While artificial intelligence (AI) was certainly the top topic of discussion during Forrester’s Technology & Innovation Summit, the conversation appears to have moved on from the technology itself. In preparation for the era of agentic AI, organisations are starting to consider where employees fit in; where to use contractors and external service providers; and which tasks should be AI-enabled internally.
In a blog post to tie in with the event, Forrester research director Mark Moccia wrote about how a third of CIOs will adopt “gig worker protocols”, where IT teams comprise AI agents, gig workers and employees with multiple jobs.
In his keynote presentation at the Technology & Innovation Summit, Manuel Geitz, Forrester principal analyst, discussed how business and technology leaders should prepare for this shift. “You start by really understanding which expertise you need to drive your business model,” he told attendees.
For Geitz, IT leaders can get their organisations ready for workflows that may be split between internal staff, external contractors and AI agents by capturing organisational knowledge in structured data ontologies, making expertise machine readable. He suggested delegates could then begin to experiment with business models that monetise this expertise on demand, eventually building a platform where AI agents become the front line for knowledge delivery, supported by humans.
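As a concrete illustration of what “machine-readable expertise” might look like, here is a minimal sketch of an ontology entry that an agent platform could query. The schema and field names are invented for this example; a real ontology would more likely build on a standard vocabulary such as SKOS or schema.org.

```python
from dataclasses import dataclass, field

@dataclass
class ExpertiseRecord:
    """One node in a hypothetical expertise ontology. The schema is
    invented for illustration; a real ontology would likely build on
    a standard vocabulary such as SKOS or schema.org."""
    skill: str                    # e.g. "tax-compliance-review"
    domain: str                   # business domain the skill serves
    provider: str                 # "employee", "contractor" or "ai_agent"
    prerequisites: list = field(default_factory=list)

# A toy knowledge base that an agent platform could query on demand.
ontology = [
    ExpertiseRecord("invoice-validation", "finance", "ai_agent"),
    ExpertiseRecord("tax-compliance-review", "finance", "employee",
                    prerequisites=["invoice-validation"]),
    ExpertiseRecord("contract-localisation", "legal", "contractor"),
]

def find_providers(skill: str) -> list:
    """Who -- human or agent -- can deliver a given skill?"""
    return [r.provider for r in ontology if r.skill == skill]

print(find_providers("invoice-validation"))  # ['ai_agent']
```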
The idea of a gig economy using AI agents is something that appears to be gaining traction among industry commentators. In a recent conversation with Computer Weekly, Jessica Apotheker, managing director and chief marketing officer at Boston Consulting Group (BCG), discussed how the marketing function – which tends to draw on both external and internal expertise – could evolve with an agentic AI workflow. As an example, she discussed the content production workflow.
“There’s a tonne of external people working in the content work group,” she said. “There’s creative agencies, production agencies, localisation agencies. There’s internal people and local marketers, and there’s the tech people. All these people need to come together and reinvent.”
According to Apotheker, this is because AI has the potential to change the content workflow process. IT and business decision-makers need to reconsider what parts of the process they want to own and what parts can be automated, or should be outsourced to a service provider who may well use AI and automation to complete the work: “What is the part of the workflow I think I need to strategically own and transform, and how will that connect with what I actually outsource or potentially automate myself?”
Putting a price on value
Research from BCG suggests organisations that are seeing significant business benefits from deploying AI tend to be AI-first, meaning business leaders reconsider the role people play in a business process or workflow when some aspects of it can, and will, be automated with AI.
“Think of an AI-first workflow,” said Apotheker. “You need to rethink what you make and what you buy. It is not obvious that your current make or buy strategy is the one that you need. You just want your contractor to do the automation on their piece of the workflow.”
In a recent podcast, Prem Ananthakrishnan, global software practice lead at Accenture, discussed how the use of AI and agentic AI in business processes is shifting how people think about software.
“There is a fundamental change from understanding that software cannot just be purchased as a tool, to thinking about software as a collaborator that’s driving an outcome for the business,” he said.
This is the next shift in software licensing, one that moves purchases of technological capabilities beyond consumption-based pricing. Mirroring the remarks of the Forrester analysts and BCG’s Apotheker, Ananthakrishnan said: “We still think of buying software as procuring a tool. We need to think about procuring a collaboration vehicle. In my view, IT buyers need to evolve from thinking about procurement to performance and design thinking. Don’t think about buying software anymore. Think about how you’re hiring digital teammates.”
Ananthakrishnan believes these digital teammates will be paid based on outcomes, using what he terms “value-based pricing”.
This is a huge mindset shift, but business and IT leaders can start with something they already have a grasp of: business process outsourcing (BPO) – evaluating which parts of the process are strategic and should remain in-house. In the conversations Apotheker has had with organisations that are considering an extreme makeover of their workflows and business processes, she said: “Either you take a BPO approach and fully outsource to somebody, hope that they will transform the process with AI and incentivise on outcomes, or you reshape the process internally.”
For now, Accenture’s Ananthakrishnan noted that token-based pricing and AI credits, which are often applied when purchasing AI-based services, are proxies for value. The more an AI service is used, the more tokens are needed and the more credits are consumed. He said these consumption-based pricing models provide a bridge to an outcome-based pricing model, where organisations hire AI agents to take on work.
Ananthakrishnan recommended that IT leaders start implementing business impact metrics, such as linking return on investment to an AI credit model. They might also consider a hybrid model priced on an upfront AI credit, where the supplier is paid a bonus if a certain outcome is achieved.
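To make the arithmetic concrete, here is a back-of-the-envelope sketch of the two models described above: pure consumption pricing, and the hybrid of upfront AI credits plus an outcome bonus. Every rate and figure is an invented assumption for illustration, not a vendor price.

```python
# Back-of-the-envelope sketch of the two pricing models above.
# Every rate and figure is an invented assumption, not a vendor price.

PRICE_PER_CREDIT = 0.50       # dollars per AI credit (assumed)
TOKENS_PER_CREDIT = 10_000    # tokens one credit buys (assumed)

def consumption_cost(tokens: int) -> float:
    """Pure consumption model: pay for usage, a proxy for value."""
    return (tokens / TOKENS_PER_CREDIT) * PRICE_PER_CREDIT

def hybrid_cost(upfront_credits: int, outcome_achieved: bool,
                bonus: float) -> float:
    """Hybrid model: upfront credits plus a bonus on an agreed outcome."""
    base = upfront_credits * PRICE_PER_CREDIT
    return base + (bonus if outcome_achieved else 0.0)

# Two million tokens a month costs the same whether or not the work
# moved any business metric.
print(consumption_cost(2_000_000))   # 100.0

# Under the hybrid model, most of the supplier's payment rides on the
# outcome (here, a hypothetical bonus tied to a measured result).
print(hybrid_cost(1_000, outcome_achieved=True, bonus=5_000.0))  # 5500.0
```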
There is plenty to consider as working practices adapt to include agentic AI – but irrespective of whether AI-enhanced work is done internally or via an external service provider, value-based pricing is coming, and people in IT leadership and procurement will need to assess how risk versus return changes when the product or service being procured operates in a probabilistic rather than a deterministic environment.
When I started hiking, big leather boots were the only real option. They were burly, stiff, and difficult to break in, but one pair would last you decades. Technology has mercifully caught up, however. If you head to the trails today, most hikers and backpackers are opting for more lightweight, low-cut options. While an influx of new shoes from brands like Hoka, Merrell, Danner, and Salomon has transformed the footwear industry, that doesn’t mean the hiking boot has had its day. It just depends on what you’re looking to do and when you’re doing it.
Which shoes should you pick to go out for the day? I tested countless pairs of great hiking boots, trail runners, and hiking shoes across a variety of terrain, from forest trails and coastal paths to high alpine routes. To get a better understanding of the differences between the many options available—and which is right for you—I grilled Ingrid Johnson, a leading footwear product specialist at REI. (For what it’s worth, Johnson’s personal recommendation is the Salomon XA Pro).
Update March 2026: We added links to recent coverage, added the On Running Cloudrock Low, and updated links and prices.
Here’s When You Need Boots
If you’re carrying a heavy pack over rough terrain, or if it’s wet or snowy, you need hiking boots. They tend to be higher at the ankle, with stiff midsoles and protective toe caps, and they are generally made from very durable materials like leather and tough synthetic fabrics like Cordura. Hiking boots prioritize stability, protection, and durability.
Boots generally have thick, deep lugs, tougher soles, stronger toe guards, and sturdier ankle support. They protect you from rock impact, uneven ground, moisture, and often colder conditions. The high-cut designs also offer more ankle support, something I found reassuring when coming back from a recent injury.
But don’t think that hiking boot brands are stuck in the dark ages. Borrowing lightweight features and materials from trail running, brands are able to offer technical boots with cushioning, grip, and stability. They’re still heavy, but featherweight compared to a traditional leather boot. Hoka’s Kaha 3 GTX ($240) is one of the best boots available, blending soft nubuck leather, a Vibram Megagrip sole, and bags of cushioning. Here are a few other picks:
Perennially popular for good reason, these Salomons boast superb levels of comfort and support without the bulk typically associated with traditional walking boots. They feel like ski boots, but that’s not a criticism; the height and support are most welcome when walking all day carrying a full pack.
Despite being declared the third-hottest year on record, 2025 was a relatively quiet year for climate disasters in the US. No major hurricanes made landfall, while the total number of acres burned in wildfires last year—a way of measuring the intensity of wildfire season—fell below the 10-year average.
But starting this week, the West is experiencing what looks to be a record-breaking heat wave, while forecasting models predict that a strong El Niño event is likely to emerge later this year. These two unrelated phenomena could set the stage for a long stretch of unpredictable and extreme weather reaching into next year, compounding the effects of a climate that’s getting hotter and hotter thanks to human activity.
First, there’s the heat. Beginning this week and heading into next, a massive ridge of high-pressure air will bring record-breaking temperatures to the American West. The National Weather Service predicts that temperature records across multiple states are set to be broken in dozens of locations, stretching as far east as Missouri and Tennessee. The NWS has issued heat warnings for parts of California, Arizona, and Nevada, as well as fire warnings for parts of Wyoming, Nebraska, South Dakota, and Colorado.
“This will be the single strongest ridge we’ve observed outside of summer in any month,” says Daniel Swain, a climate scientist at the University of California Agriculture and Natural Resources.
The other remarkable thing about this heat wave, Swain says, is just how long it’s going to last. “This is not a day or two of extreme heat,” he says. “We’ve already in some of these places been seeing record highs every day for a week, and we expect to see them every day for at least another seven to 10 days.” The tail end of March will be much more intense, with temperatures in some places breaking April and May records. “There aren’t that many weather patterns that can result in an 85- or 90-degree temperature in San Francisco, Salt Lake City, and Denver in the same week.”
This late winter heat wave is adding on to an already warm winter in the West—with big implications for the summer. A month ago, snowpack levels across multiple states were at record lows thanks to warmer-than-average temperatures, and according to data from the Department of Agriculture, they were still sitting below 50 percent of average across many Western states. Snowpack is a critical natural reservoir for rivers in the West; between 60 and 70 percent of the region’s water supply in many areas comes from melting snow. Low snowpack is a bad sign for already-stressed rivers like the Colorado, which supplies water for 40 million people in seven states.
The ongoing heat wave, Swain says, will more than likely make conditions even worse. “April 1st is typically the point at which snowpack would be, at least historically, at its peak,” he says. Even if temperatures cool off between now and summer, these low snowpack levels are also a worrisome sign for the upcoming fire season. Snow droughts like the one the West is experiencing can dry out soil, kill trees, and lessen stream flow: ideal conditions for a wildfire to grow. Meanwhile, the water supply in the Colorado River could drop even lower. States that rely on the river are already facing a political crisis as they attempt to renegotiate water rights; a drought would only up the ante.
Then there’s El Niño. Last week, the National Weather Service announced that there was more than a 60 percent chance of an El Niño event emerging in August or September. Various weather models suggest that this El Niño could be particularly strong. While we likely won’t know for sure until summer, “the fact that [all the models] are moving upwards is worth watching,” says Zeke Hausfather, a research scientist at Berkeley Earth.
The UK government recently unveiled its UK fusion strategy 2026, which includes £125m of funding to develop the artificial intelligence (AI) growth zone at Culham, Oxfordshire. This includes a £45m investment in “Sunrise”, the new fusion-dedicated supercomputer.
One area in which Sunrise will be used is accelerating simulation, surrogate modelling and design, where AI could simplify models or learn the behaviour of complex systems such as plasmas, speeding up simulations that previously took weeks or months to run.
It will also be used for data management, making the UK Atomic Energy Authority’s (UKAEA) fusion research and experimental data consistent, accessible and electronically readable. In addition, Sunrise offers the UKAEA – an executive non-departmental public body, sponsored by the Department for Energy Security and Net Zero (DESNZ) – the ability to enhance experimental operations and control in real-time diagnostics, where AI can be trained to spot anomalies and flag issues.
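As a loose illustration of the kind of real-time anomaly flagging described, the sketch below applies a rolling z-score test to a synthetic diagnostic trace. It is a stand-in only: UKAEA’s trained models are not public, and every threshold here is an assumption.

```python
import numpy as np

def flag_anomalies(signal: np.ndarray, window: int = 50,
                   threshold: float = 5.0) -> np.ndarray:
    """Flag samples more than `threshold` standard deviations from
    the trailing-window mean -- a crude stand-in for the trained
    models that would monitor real diagnostic streams."""
    flags = np.zeros(len(signal), dtype=bool)
    for i in range(window, len(signal)):
        w = signal[i - window:i]
        mu, sigma = w.mean(), w.std()
        if sigma > 0 and abs(signal[i] - mu) > threshold * sigma:
            flags[i] = True
    return flags

rng = np.random.default_rng(0)
trace = rng.normal(0.0, 1.0, 1_000)   # synthetic sensor trace
trace[700] += 12.0                    # inject a fault-like spike

print(np.flatnonzero(flag_anomalies(trace)))  # [700]
```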
The role of high-performance computing (HPC) AI acceleration hardware within the government’s strategy for nuclear fusion is to prepare fusion data for AI applications, ensuring that researchers from small and medium-sized enterprises (SMEs) and academic institutions can access the data, and supporting greater collaboration and engagement with industry partners.
The 6.76 exaflops Sunrise AI supercomputer is a collaboration between AMD, DESNZ, the Department for Science, Innovation and Technology (DSIT), Dell Technologies, Intel, UKAEA, the University of Cambridge, and Weka, a data platform provider.
Looking at its headline performance data, Rob Akers, UKAEA’s director for computing programmes, says: “It’s very challenging to define how powerful a piece of hardware like Sunrise is, because it depends on your metric for success.”
Sunrise offers the full spectrum of floating point precisions, from 8-bit right the way up to 64-bit precision, but, as Akers points out, each one of those targets a different part of the problem. “The important thing for us is that we can’t forego 64-bit precision, because that’s what’s going to feed the artificial intelligence algorithms that we’ll be applying when using Sunrise as an engineering tool,” he says.
AI makes it possible to collapse high-fidelity models that need very high bit precision down into what UKAEA calls “surrogate” models, according to Akers, who adds that these surrogates can run on a workstation or a laptop in a tiny fraction of the time it would take the big solvers running on large supercomputers.
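A minimal sketch of that workflow, with a toy one-dimensional function standing in for the expensive solver and a polynomial fit standing in for the learned surrogate (UKAEA’s actual surrogate models are far more sophisticated):

```python
import numpy as np

# A stand-in for an expensive high-fidelity solver: in reality this
# would be, say, a plasma simulation taking hours on a supercomputer.
def expensive_solver(x: np.ndarray) -> np.ndarray:
    return np.sin(2 * x) * np.exp(-0.5 * x)   # toy physics

# Offline, on the big machine: sample the solver to build training data.
x_train = np.linspace(0.0, 3.0, 200)
y_train = expensive_solver(x_train)

# Fit a cheap surrogate -- a polynomial here for simplicity; UKAEA's
# surrogates are learned models, but the workflow has the same shape.
surrogate = np.poly1d(np.polyfit(x_train, y_train, deg=9))

# Online, on a laptop: the surrogate answers almost instantly.
x_new = np.array([0.4, 1.7, 2.6])
print(surrogate(x_new))          # fast approximation
print(expensive_solver(x_new))   # what the full solver would return
```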
“It’s almost like an instrument for discovery,” he adds. “Sunrise is not like a laptop. It’s not just a very powerful laptop – it is a very complex piece of machinery that we’ll be putting to the task of solving a very large set of complex problems.”
One of the interesting numbers that pop up in the specification for Sunrise is the figure for 8-bit precision, especially given that 8-bit computing harks back to the era of the home computer some 50 years ago.
“The interesting thing is that 8-bit precision has become an incredibly powerful part of the computing landscape now because of large language models [LLMs],” says Akers.
Running LLMs is part of UKAEA’s plans. “We are going to be doing work in that space, building very bespoke models that will ingest text document archives that have been collected over many, many decades, and turning that into useful information and knowledge,” he says.
Digital twins
Akers says this information will be put together with the Mega Amp Spherical Tokamak (MAST) experimental data run at Culham. “Working out how to achieve this needs the full spectrum of precision,” he says.
Although 8-bit precision is the domain of the LLMs that need to process tokens as quickly as possible to understand volumes of textual information, Akers says 64-bit precision is the realm of high-fidelity simulation, which needs to achieve a high degree of accuracy. “Because of the way we run models forward in time, we can’t allow them to drift. They need to preserve certain physical quantities to ensure the simulations are meaningful,” he says.
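The drift Akers describes can be demonstrated with a toy time-stepped model. The sketch below integrates a frictionless harmonic oscillator, whose energy should be exactly conserved, at two precisions; it is an illustration of the numerical effect, not UKAEA’s code.

```python
import numpy as np

def energy_drift(steps: int, dtype) -> float:
    """Integrate a frictionless harmonic oscillator with velocity
    Verlet and return how far the (conserved) energy has drifted.
    A toy stand-in for the long time-stepped runs Akers describes."""
    dt = dtype(1e-3)
    half = dtype(0.5)
    x, v = dtype(1.0), dtype(0.0)
    e0 = half * (v * v + x * x)      # energy at t = 0
    for _ in range(steps):
        v = v - half * dt * x        # half kick (force = -x)
        x = x + dt * v               # drift
        v = v - half * dt * x        # half kick with updated x
    e1 = half * (v * v + x * x)
    return float(abs(e1 - e0))

# Rounding error accumulates step by step, so the cheap precision
# drifts where the expensive one holds the physics (exact numbers
# vary by platform).
print("float32:", energy_drift(100_000, np.float32))
print("float64:", energy_drift(100_000, np.float64))
```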
So, while floating point precision is a common yardstick for comparing AI machines, for Akers it is not necessarily the best measure of the outright performance of a scientific AI machine. What is needed, he says, is “the ability to simulate very high-fidelity, strongly coupled models”.
This is due to the sheer complexity of a machine that aims to mimic the way the sun generates its power. “In a nuclear fusion power plant, there are lots of different physical mechanisms that couple the plant together – everything from structural forces due to gravity, but also due to electromagnetism. Then there’s the heat flow and radiation flow across the system. Everything’s coupled together,” says Akers.
Historically, UKAEA has not been able to simulate this environment at scale. “What we worry about is the black swans or emergent behaviour that is a result of that coupling,” he adds.
Akers says digital twins running on Sunrise will be able to model these very complex systems, which can then be compared with the results of experiments. “We are able to tune up our ability to step forward in time or step outside where we’ve been before, or indeed to create new pieces of machinery that we’ve never seen before, and take a giant leap where we have confidence in having nailed down the known unknowns into the simulations,” he says.
“Test-based design is expensive, and it’s slow,” Akers adds. The goal is to use Sunrise to reduce the amount of test-based design that UKAEA has to do. “It will allow us to take on a moonshot-like problem, a lot more cost-effectively, to reduce risk and accelerate the time to deliver commercial fusion.”