Over the past year, Kate Balingit has been leading the digital health initiative at Mars Nutrition, reporting to the company’s pet care chief information officer, where she is focused on commercialising and deploying artificial intelligence (AI) through the Mars pet nutrition brands. These include well-known pet food brands such as Pedigree, Iams, Royal Canin, Sheba and Whiskas.
“Even though we’re building tech products, Mars is a non-tech company,” says Balingit, whose official job title is Mars Petcare head of digital innovation. “We kind of abide by the same standards of scientific credibility and scientific rigour that apply to our primary business of food.”
A former Googler who was also involved in Waze, Balingit joined Mars Petcare in 2022 to head up Whistle, the “Fitbit for dogs” company Mars acquired in 2016 (see Career at Google).
She says Mars Petcare has made a large commitment to digitising the pet care business. This includes everything from upskilling staff to digitising factories and its supply chain, as well as elevating the e-commerce experiences. Digitisation also covers emerging technologies such as using agentic AI for automating workflows and mining digital health data.
On the AI front, rather than rely on existing large language models (LLMs), she says the business is focused on building the computer vision algorithms itself: “We’re building image classifiers to detect signs of emerging health conditions and enterprise software components that enable us to create user experiences that can safely live on our brand digital properties. It comes down to differentiated assets – our proprietary data sets bootstrap an image database and then we work with vets to label the images and train the algorithm.”
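Mars does not publish its pipeline, but the labelling workflow Balingit describes – vets annotating a proprietary image set – could be sketched like this minimal example, with the record fields and majority-vote consensus rule invented for illustration:

```python
from collections import Counter

def consensus_label(annotations):
    """Majority vote across veterinary annotators for one image.

    `annotations` is a list of labels (e.g. "plaque", "healthy") from
    different vets; ties return None so the image goes back for review.
    """
    counts = Counter(annotations).most_common()
    if len(counts) > 1 and counts[0][1] == counts[1][1]:
        return None  # no consensus -- send back to the vets
    return counts[0][0]

# One record per image, keeping provenance of the source data set
dataset = [
    {"image_id": "img_001", "source": "proprietary_db",
     "annotations": ["plaque", "plaque", "healthy"]},
    {"image_id": "img_002", "source": "proprietary_db",
     "annotations": ["tartar", "gum_irritation", "healthy"]},
]

labelled = {r["image_id"]: consensus_label(r["annotations"]) for r in dataset}
# img_001 -> "plaque"; img_002 -> None (no consensus)
```

Keeping the source field alongside every label is what makes the “we can say where we sourced our data” claim auditable later.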
She says these algorithms go through the same kind of scientific governance rigor as the food part of the business. “We do have to be able to say where we sourced our data. We’re also very explicit about publishing how we train the models.” This, she says, is a differentiator. “You don’t get a free pass just because you’re working with algorithms. At a non-tech company, you have to abide by the same quality standards that apply to the entire business.”
Among the challenges the company aims to address is how to build products and digital experiences that meet the unique needs of individual brands and business units while offering a unique differentiator. Much of the work involves the data architecture for structuring all of the data the company collects from pet parents through the apps it develops.
“We’re working with emerging technologies like computer vision and trying to build products with a platform approach to enable us to repurpose these assets in different types of applications,” she says. “My team takes a very component-based approach. I don’t see us building products. Instead, we are building a series of capabilities.”
Digitising pet care
There are around 200 people working in the digital transformation organisation at Mars Petcare. Balingit’s role involves orchestrating initiatives across three core functions: science, data science and software engineering.
“The digital health initiative starts with science; we’re building scientific instruments,” she says. These algorithms are capable of detecting the emerging presence of health conditions in dogs. “I start by partnering with the global R&D [research and development] science function, which includes specialists in oral health, skin health, gut health and healthy ageing.”
The team puts together a specification for the product, such as deciding which symptoms of a health condition the software should be able to detect. The data science team then builds the algorithm to detect that condition.
“In the case of a canine dental check, we’re detecting plaque, tartar and gum irritation. I work with our data science team to build the algorithm – we have to acquire the training data and we have to label it, then we build the computer vision models using Azure developer tools.”
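The real models are built with Azure tools on tens of thousands of labelled images. As a rough illustration of the “one detector per condition” idea, here is a toy nearest-centroid classifier over made-up feature vectors – everything below, including the features, is hypothetical:

```python
# Toy stand-in for the multi-condition detector: one binary
# nearest-centroid classifier per condition over image feature vectors.
# (Feature extraction from real images is elided.)

def centroid(vectors):
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(examples):
    """examples: list of (feature_vector, has_condition: bool)."""
    pos = [v for v, y in examples if y]
    neg = [v for v, y in examples if not y]
    return centroid(pos), centroid(neg)

def predict(model, vector):
    pos_c, neg_c = model
    dist = lambda c: sum((a - b) ** 2 for a, b in zip(vector, c))
    return dist(pos_c) < dist(neg_c)

# One model per condition, as the article describes
conditions = ["plaque", "tartar", "gum_irritation"]
training_data = {
    "plaque": [([0.9, 0.1], True), ([0.8, 0.2], True), ([0.1, 0.9], False)],
    "tartar": [([0.7, 0.6], True), ([0.1, 0.2], False), ([0.2, 0.1], False)],
    "gum_irritation": [([0.5, 0.9], True), ([0.4, 0.1], False), ([0.3, 0.2], False)],
}
models = {c: train(training_data[c]) for c in conditions}

def screen(vector):
    """Return every condition whose detector fires on this image."""
    return [c for c in conditions if predict(models[c], vector)]
```

The per-condition structure matters: each detector can be trained, validated and documented independently, which fits the scientific governance described above.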
The algorithm is made available via an application programming interface (API). Balingit then works with the software engineering team on the actual product experience. “It’s a truly cross-functional effort,” she says.
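What “made available via an API” might look like, reduced to a single handler function – the endpoint shape, field names and stubbed model below are assumptions for illustration, not Mars Petcare’s actual interface:

```python
import json

def dental_check_handler(request_body: bytes) -> tuple:
    """Hypothetical API handler wrapping the screening algorithm.

    Takes a JSON body with an image reference and returns
    (HTTP status, JSON response). The model call is stubbed.
    """
    try:
        payload = json.loads(request_body)
        image_id = payload["image_id"]
    except (ValueError, KeyError):
        return 400, json.dumps({"error": "expected {'image_id': ...}"})

    findings = run_model(image_id)  # placeholder for the real model call
    return 200, json.dumps({"image_id": image_id, "findings": findings})

def run_model(image_id):  # stub standing in for the vision model
    return ["plaque"] if image_id.endswith("1") else []

status, body = dental_check_handler(b'{"image_id": "img_001"}')
```

Separating the handler from the model stub mirrors the cross-functional split Balingit describes: data science owns `run_model`, software engineering owns everything around it.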
The software needs to meet not only the high standards associated with the brand, but also a high bar for enterprise architecture, data security and data privacy. With these high standards, Balingit says: “Data science and software engineering can do something really special, which is to scale scientific understanding and put these capabilities into the hands of pet parents around the world through our biggest brands.”
Greenies is a recent example of one of the brands with an AI tool. “Our use of AI in the Greenies Canine Dental Check tool started with a pet parent insight. We know that 80% of dogs have signs of periodontal disease by the age of three, but 72% of pet parents think that their dog’s oral health is fine,” she says.
The team wanted to address this awareness gap among pet owners using AI to, as Balingit puts it, “make the invisible visible and help people to understand that their dog is experiencing an oral health issue.”
“We’re very explicit about publishing how we train the models. You don’t get a free pass just because you’re working with algorithms”
Kate Balingit, Mars Petcare
The Greenies Canine Dental Check required a computer vision algorithm trained on more than 50,000 images of dogs. “We built an algorithm that was capable of taking a smartphone image to understand if the photograph is of a dog and, if it is, if it’s showing the dog’s mouth and its teeth are visible.” The algorithm then needs to analyse the image to determine whether the tooth has visual signs of oral disease.
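The staged checks she describes – is it a dog, is the mouth visible, are teeth visible, then analyse – form a gated pipeline. A minimal sketch, with each vision model stubbed out and the guidance messages invented:

```python
# Staged gating: validate the photo before analysing it.
# Each gate is a stub standing in for a separate vision model.

def is_dog(image): return image.get("subject") == "dog"
def mouth_visible(image): return image.get("mouth_visible", False)
def teeth_visible(image): return image.get("teeth_count", 0) > 0

def dental_check(image):
    """Run the gates in order; stop with guidance at the first failure."""
    for gate, message in [
        (is_dog, "No dog detected -- retake the photo"),
        (mouth_visible, "Dog's mouth not visible"),
        (teeth_visible, "Teeth not visible -- lift the lip gently"),
    ]:
        if not gate(image):
            return {"ok": False, "message": message}
    return {"ok": True, "findings": analyse_teeth(image)}

def analyse_teeth(image):  # stub for the oral-disease model
    return ["plaque"] if image.get("plaque_score", 0) > 0.5 else []

result = dental_check({"subject": "dog", "mouth_visible": True,
                       "teeth_count": 10, "plaque_score": 0.7})
```

Early exits with specific guidance are what let a consumer app coach the pet parent towards a usable photo instead of silently failing.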
When asked about the success in capturing teeth in a pet dog’s mouth, she says: “We always encourage caution. But when I’ve looked at the data, the average user captures about 10.2 teeth in the photo itself.” So, while it may seem a major undertaking for pet owners to attempt taking smartphone photos of their dog’s mouth with visible teeth, in Balingit’s experience, pet parents are “very capable”.
Another consideration is the level of accuracy. Balingit says: “No algorithm is going to be 100% accurate. A human is not 100% accurate. What’s really important is that we are not building a diagnostic device. Our goal was to build a health-screening instrument that could find visual indicators of an emerging disease.” As such, the 97% accuracy the algorithm achieves is good enough.
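The distinction between overall accuracy and screening performance is worth spelling out. With illustrative confusion counts (not Mars Petcare’s figures), a 97% accuracy number decomposes into sensitivity and specificity like so:

```python
def metrics(tp, fp, tn, fn):
    """Standard screening metrics from a confusion matrix."""
    return {
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
        "sensitivity": tp / (tp + fn),   # emerging cases caught
        "specificity": tn / (tn + fp),   # healthy dogs correctly cleared
    }

# Illustrative counts for 1,000 screened photos (invented numbers)
m = metrics(tp=180, fp=24, tn=790, fn=6)
# accuracy = 0.97 -- but for a screening tool, sensitivity
# (missing as few emerging cases as possible) is the number to watch
```

A screening instrument is typically tuned to favour sensitivity over specificity: a false alarm costs a vet visit, while a missed case delays treatment.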
An approach to business AI
As Balingit notes: “AI is just top of mind for everybody right now.” Like many businesses deploying AI applications, she points out that the past two years have been “a whirlwind”, which means companies such as Mars Petcare need to figure out what they should be doing with AI.
“It’s important to be intentional about what we’re doing, and the key question for me is, ‘What do we at Mars Petcare have that an AI company in Silicon Valley doesn’t have? What are our unique assets and how do we build an AI innovation agenda on top of them?’”
Looking to the future and advances in digital technologies, Balingit believes the world of internet of things (IoT) sensors and AI offers a tantalising opportunity for the business and pet owners alike. While people talking to their pets like Dr Dolittle may seem a bit far-fetched, she says: “Our pets do talk to us with their movements, their facial expressions.” Inevitably, many pet owners may miss these subtle signs, but AI could offer a way to spot these.
Balingit sees an opportunity to use sensor data to help quantify animal behaviour and then apply AI to translate the sensor data into something humans can understand. In a world where digital technologies have made people ever-more disconnected from the real world, tech innovation may one day offer a way for pet parents to have a closer relationship with their furry friends.
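The sensor-to-insight idea can be sketched with a toy classifier over accelerometer windows – the thresholds and labels below are invented, standing in for a learned model:

```python
import math

def activity_label(window):
    """Classify one window of accelerometer samples (x, y, z) using
    hand-written variance thresholds -- a stand-in for a trained model."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in window]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    if var < 0.01:
        return "resting"
    if var < 1.0:
        return "walking"
    return "running"

# Synthetic windows of (x, y, z) readings
resting = [(0.0, 0.0, 1.0)] * 50
walking = [(0.0, 0.0, 1.0), (0.5, 0.5, 1.0)] * 25
running = [(0.0, 0.0, 1.0), (2.0, 2.0, 2.0)] * 25
```

In a real collar product the features and thresholds would be learned from labelled data, but the pipeline shape – sensor window in, human-readable label out – is the translation step Balingit describes.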
And thanks to a mention in Dan Brown’s new novel, The Secret of Secrets, Prague’s Signal Festival has gained even more global recognition. Just a few weeks after the release of Brown’s new bestseller set in contemporary Prague, viewers were able to see for themselves what drew the popular writer to the festival, which is the largest Czech and Central European showcase of digital art. In one passage, the Signal Festival has a cameo appearance when the novel’s protagonist recalls attending an event at the 2024 edition.
“We’re happy about it,” festival director Martin Pošta says about the mention. “It’s a kind of recognition.” Not that the event needed promotion, even in one of the most anticipated novels of recent years. The organizers have yet to share the number of visitors to the festival this year, but the four-day event typically attracts half a million visitors.
On the final day, there was a long queue in front of the monumental installation Tristan’s Ascension by American video art pioneer Bill Viola before it opened for the evening, even though it was a ticketed event. In the Church of St. Salvator in the Convent of St. Agnes, visitors could watch a Christ-like figure rise upwards, streams of water defying gravity along with him, all projected on a huge screen.
The festival premiere took place on the Vltava River near the Dvořák Embankment. Taiwan’s Peppercorns Interactive Media Art presented a projection on a cloud of mist called Tzolk’in Light. While creators of other light installations have to deal with the challenges of buildings—their irregular surfaces, decorative details, and awkward cornices—projecting onto water droplets is a challenge of a different kind, with artists having to give up control over the resulting image. The shape and depth of the Peppercorns’ work depended on the wind at any given moment, which determined how much of the scene was revealed to viewers and how much was simply blown away. The reward, however, was an extraordinary 3D spectacle reminiscent of a hologram—something that can’t be achieved with video projections on static and flat buildings.
Another premiere event was a projection on the tower of the Old Town Hall, created for the festival by the Italian studio mammasONica. It transformed the 230-foot structure into a kaleidoscope of blue, green, red, and white surfaces. A short distance away, on Republic Square, Peppercorns had another installation. On a circular LED installation, they projected a work entitled Between Mountains and Seas, which recounted the history of Taiwan.
Software development is associated with the idea of not reinventing the wheel, which means developers often select components or software libraries with pre-built functionality, rather than write code to achieve the same result.
There are many benefits of this approach. For example, a software component that is widely deployed is likely to have undergone extensive testing and debugging. It is considered tried and trusted, mature technology, unlike brand-new code, which has not been thoroughly debugged and may inadvertently introduce unknown cyber security issues into the business.
The Lego analogy is often used to describe how these components can be put together to build enterprise applications. Developers can draw on functionality made available through application programming interfaces (APIs), which provide programmatic access to software libraries and components.
Increasingly, in the age of data-driven applications and greater use of artificial intelligence (AI), API access to data sources is another Lego brick that developers can use to create new software applications. And just as is the case with a set of old-school Lego bricks, constructing the application from the numerous software components available is left to the creativity of the software developer.
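The Lego-brick composition described above amounts to glue code over existing clients. A minimal sketch, with both “bricks” stubbed out as hypothetical API clients:

```python
# Composing "Lego brick" components: two hypothetical API clients
# stitched together into one new application function.

def fetch_customer(customer_id):        # stand-in for a data-source API
    return {"id": customer_id, "country": "DE"}

def vat_rate(country):                  # stand-in for a tax-service API
    return {"DE": 0.19, "FR": 0.20}.get(country, 0.0)

def invoice_total(customer_id, net_amount):
    """The 'new' application: only glue code, no reinvented wheels."""
    customer = fetch_customer(customer_id)
    return round(net_amount * (1 + vat_rate(customer["country"])), 2)

total = invoice_total("c-42", 100.0)  # -> 119.0
```

All the business logic lives in the reused components; the developer’s creativity goes into how they are wired together.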
A Lego template for application development
To take the Lego analogy a bit further, there are instructions, templates and pathways developers can be encouraged to follow to build enterprise software that complies with corporate policies.
A developer self-service platform provides a way for organisations to offer their developers almost pre-authorised assets, artefacts and tools that they can use to develop code
Roy Illsley, Omdia
Roy Illsley, chief analyst, IT operations, at Omdia, defines an internal developer platform (IDP) as a developer self-service portal to access the tools and environments that the IT strategy has defined the organisation should standardise on. “A developer self-service platform provides a way for organisations to offer their developers almost pre-authorised assets, artefacts and tools that they can use to develop code,” he says.
The basic idea is to provide a governance framework with a suite of compliant tools. Bola Rotibi, chief of enterprise research at CCS Insight, says: “A developer self-service platform is really about trying to get a governance path.”
Rotibi regards the platform as “a golden path”, which provides developers who are not as skilled as more experienced colleagues a way to fast-track their work within a governance structure that allows them a certain degree of flexibility and creativity.
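One concrete way a golden path constrains without dictating is a scaffold check against pre-authorised tooling. The policy lists and field names below are invented for illustration:

```python
# Hypothetical "golden path" check: a new service scaffold is validated
# against the tools the organisation has pre-authorised, while fields
# outside the policy are left to the developer's discretion.

APPROVED = {
    "language": {"python", "typescript", "go"},
    "ci": {"github-actions"},
    "secrets": {"vault"},
}

def validate_scaffold(scaffold):
    """Return a list of governance violations (empty means compliant)."""
    return [
        f"{field}={value!r} is not on the approved list"
        for field, value in scaffold.items()
        if field in APPROVED and value not in APPROVED[field]
    ]

issues = validate_scaffold({"language": "python", "ci": "jenkins",
                            "secrets": "vault", "framework": "anything"})
# -> one issue flagged, for the unapproved CI system
```

Note that `framework` passes untouched: the governance path only covers what the policy names, which is the “flexibility and creativity” Rotibi describes.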
The reason flexibility for developers is an important consideration falls under the umbrella of developer experience and productivity. SnapLogic, which effectively provides modern middleware, is used in digital transformation projects to connect disparate systems, and is now being repositioned for the age of agentic AI.
SnapLogic’s chief technology officer, Jeremiah Stone, says quite a few of the companies it has spoken to that identify as leaders in business transformation regard a developer portal offering self-service as something that goes hand-in-hand with digital infrastructure and AI-powered initiatives.
SnapLogic’s platform offers API management and service management, which manages the lifecycle of services, version control and documentation through a developer portal called the Dev Hub.
Stone says the capabilities of this platform extend from software developers to business technologists, and now AI users, who, he says, may be looking for a Model Context Protocol (MCP) endpoint.
Such know-how captured in a self-service developer portal enables users – whether they are software developers, or business users using low-code or no-code tooling – to connect AI with existing enterprise IT systems.
Enter Backstage
One platform that seems to have captured the minds of the developer community when it comes to developer self-service is Backstage. Having begun life internally at audio streaming service Spotify, Backstage is now an open source project managed by the Cloud Native Computing Foundation (CNCF).
While many teams that implemented Backstage assumed that it would be an easy, free addition to their DevOps practices, that isn’t always the case. Backstage can be complex and requires engineering expertise to assemble, build and deploy
Christopher Condo and Lauren Alexander, Forrester
Pia Nilsson, senior director of engineering at the streaming service, says: “At Spotify, we’ve learned that enabling developer self-service begins with standardisation. Traditional centralised processes create bottlenecks, but complete decentralisation can lead to chaos. The key is finding the middle ground – standardisation through design, where automation and clear workflows replace manual oversight.”
Used by two million developers, Backstage is an open source framework for building internal developer portals. Nilsson says Backstage provides a single, consistent entry point for all development activities – tools, services, documentation and data. She says this means “developers can move quickly while staying aligned with organisational standards”.
Nilsson points out that standardising the fleet of components that comprise an enterprise technology stack is sometimes regarded as a large migration effort, moving everyone onto a single version or consolidating products into one. However, she says: “While that’s a critical part of standardising the fleet, it’s even more important to figure out the intrinsic motivator for the organisation to keep it streamlined and learn to ‘self-heal’ tech fragmentation.”
According to Nilsson, this is why it is important to integrate all in-house-built tools, as well as all the developer tools the business has purchased, in the same IDP. Doing so, she notes, makes it very easy to spot duplication. “Engineers will only use what they enjoy using, and we usually enjoy using the stuff we built ourselves because it’s exactly what we need,” she says.
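Spotting duplication once everything sits in one catalogue can be as simple as grouping entries by the capability they provide. The catalogue entries below are hypothetical:

```python
from collections import defaultdict

def find_duplicates(catalog):
    """Group catalogue entries by capability; flag any capability
    provided by more than one component."""
    by_capability = defaultdict(list)
    for entry in catalog:
        by_capability[entry["capability"]].append(entry["name"])
    return {cap: names for cap, names in by_capability.items()
            if len(names) > 1}

catalog = [
    {"name": "feature-flags-v1", "capability": "feature-flags"},
    {"name": "flagsmith-wrapper", "capability": "feature-flags"},
    {"name": "billing-api", "capability": "billing"},
]
dupes = find_duplicates(catalog)
# -> {"feature-flags": ["feature-flags-v1", "flagsmith-wrapper"]}
```

The hard part in practice is not the grouping but agreeing on the capability taxonomy – which is why Nilsson frames it as an organisational motivator rather than a migration.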
The fact that Backstage is a framework is something IT leaders need to consider. In a recent blog post, Forrester analysts Christopher Condo and Lauren Alexander warned that most IDPs are frameworks that require assembly: “While many teams that implemented Backstage assumed that it would be an easy, free addition to their DevOps practices, that isn’t always the case. Backstage can be complex and requires engineering expertise to assemble, build and deploy.”
However, Forrester also notes that commercial IDP options are now available that include an orchestration layer on top of Backstage. These offer another option that may be a better fit for some organisations.
AI in an IDP
Beyond the assembly work organisations face if they do not buy a commercial IDP, AI is revolutionising software development, and its impact needs to be factored into any decisions made around developer self-service and IDPs.
Spotify’s Nilsson believes it is important for IT leaders to figure out how to support AI tooling usage in the most impactful way for their company.
“Today, there is both a risk to not leveraging enough AI tools or having it very unevenly spread across the company, as well as the risk that some teams give in to the vibes and release low-quality code to production,” she says.
According to Nilsson, this is why the IT team responsible for the IDP needs to drive up the adoption of these tools and evaluate the impact over time. “At Spotify, we drive broad AI adoption through education and hack weeks, which we promote through our product Skill Exchange. We also help engineers use context-aware agentic tools,” she adds.
Looking ahead
In terms of AI tooling, an example of how developer self-service could evolve is the direction of travel SAP looks to be taking with its Joule AI copilot tool.
When structure, automation and visibility are built into the developer experience, you replace bottlenecks with flow and create an environment where teams can innovate quickly, confidently and responsibly
Pia Nilsson, Spotify
CCS Insight’s Rotibi believes the trend to integrate AI into developer tools and platforms is an area of opportunity for developer self-service platforms. Among the interesting topics Rotibi saw at the recent SAP TechEd conference in Berlin was the use of AI in SAP Joule.
SAP announced new AI assistants in Joule, which it said are able to coordinate multiple agents across workflows, departments and applications. According to SAP, these assistants plan, initiate and complete complex tasks spanning finance, supply chain, HR and beyond.
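SAP has not published Joule’s internals, but the coordination pattern it describes – an assistant routing planned steps to department agents – can be sketched generically, with all agent names and steps invented:

```python
# Minimal sketch of assistant-coordinated agents: a plan is an ordered
# list of (department, step) pairs, and each step is routed to the
# matching department agent. All names here are illustrative.

AGENTS = {
    "finance": lambda step: f"finance handled: {step}",
    "supply_chain": lambda step: f"supply chain handled: {step}",
    "hr": lambda step: f"hr handled: {step}",
}

def run_task(plan):
    """Execute a cross-department plan, failing fast on unknown routes."""
    results = []
    for department, step in plan:
        agent = AGENTS.get(department)
        if agent is None:
            raise ValueError(f"no agent for {department}")
        results.append(agent(step))
    return results

log = run_task([("finance", "approve budget"),
                ("supply_chain", "reserve stock")])
```

The interesting engineering is in what this sketch omits: planning the steps, passing context between agents and recovering when one fails mid-task.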
“SAP Joule is an AI interface. It’s a bit more than just a chatbot. It is also a workbench,” says Rotibi. Given that Joule has access to the SAP product suite, she notes that, as well as providing access, Joule understands the products. “It knows all the features and functions SAP has worked on, and, behind the scenes, uses the best data model to get the data points the user wants,” she says.
Recognising that enterprise software developers will want to build their own applications and create their own integration between different pieces of software, she says SAP Joule effectively plays the role of a developer self-service portal for the SAP product suite.
Whatever comes next with AI-powered functionality, there are numerous benefits to offering developer self-service to improve the overall developer experience – but there needs to be structure and standards.
Nilsson says: “When structure, automation and visibility are built into the developer experience, you replace bottlenecks with flow and create an environment where teams can innovate quickly, confidently and responsibly.”
First a confession: I own more MoonSwatches than I care to admit. Never let it be said that WIRED does not walk the walk when it comes to recommending products—Swatch has assiduously extracted a considerable amount of cash from me, all in $285 increments. This was no doubt the Swiss company’s dastardly plan all along, to lure us in, then, oh so gently, get watch fans hooked. The horological equivalent of boiling a frog. It’s worked, too—Swatch has, so far, netted hundreds of millions of dollars from MoonSwatch sales.
But while I’ve been a fan of the Omega X Swatch mashup since we reported on exactly how the hugely lucrative collaboration came to be in the first place, I have never liked the iterative Moonshine Gold versions. Employing a sliver of Omega’s exclusive 18K pale yellow gold alloy in marginally different ways on each design, they seemed almost cynical—a way of milking the MoonSwatch superfans on the hunt to complete the set.
A hidden Snoopy message on the Cold Moon’s dial is revealed under UV light.
Photograph: Courtesy of Swatch
The MoonSwatch comes with a rubber strap upgrade over the original launch models.
Photograph: Courtesy of Swatch
Now, though, just when I thought I was done with MoonSwatch—having gone as far as to upgrade all of mine with official $45 color-matching rubber straps—Swatch has managed to ensnare me once again, and with a Moonshine Gold model: the new MoonSwatch Mission To Earthphase Moonshine Gold Cold Moon.
Clumsy moniker aside, this version takes the all-white 2024 Snoopy model (WIRED’s top pick of the entire collection), mixes it with the Earthphase MoonSwatches, and replaces the inferior original strap with a superior white and blue Swatch rubber velcro one. Aesthetically, it’s definitely a win, but this is not the Cold Moon’s party trick.
On each $450 Cold Moon MoonSwatch, a snowflake is lasered onto its Moonshine Gold moon phase indicator—and, just like a real snowflake, Swatch claims each one will be completely unique. When you consider the volumes of MoonSwatches Swatch produces each year, this is no mean feat.
The unique golden snowflakes appear on the moon phase dial of the Cold Moon.