Tech

Interview: Differentiating with AI in pet care | Computer Weekly
Over the past year, Kate Balingit has been leading the digital health initiative at Mars Nutrition, reporting to the company’s pet care chief information officer, where she is focused on commercialising and deploying artificial intelligence (AI) through the Mars pet nutrition brands. These include well-known pet food brands such as Pedigree, Iams, Royal Canin, Sheba and Whiskas.

“Even though we’re building tech products, Mars is a non-tech company,” says Balingit, whose official job title is Mars Petcare head of digital innovation. “We kind of abide by the same standards of scientific credibility and scientific rigour that apply to our primary business of food.”

A former Googler who was also involved in Waze, Balingit joined Mars Petcare in 2022 to head up Whistle, the “Fitbit for dogs” company Mars acquired in 2016.

She says Mars Petcare has made a large commitment to digitising the pet care business. This includes everything from upskilling staff to digitising factories and its supply chain, as well as elevating the e-commerce experiences. Digitisation also covers emerging technologies such as using agentic AI for automating workflows and mining digital health data.

On the AI front, rather than rely on existing large language models (LLMs), she says the business is focused on building the computer vision algorithms itself: “We’re building image classifiers to detect signs of emerging health conditions and enterprise software components that enable us to create user experiences that can safely live on our brand digital properties. It comes down to differentiated assets – our proprietary data sets bootstrap an image database and then we work with vets to label the images and train the algorithm.”

She says these algorithms go through the same kind of scientific governance rigour as the food part of the business. “We do have to be able to say where we sourced our data. We’re also very explicit about publishing how we train the models.” This, she says, is a differentiator. “You don’t get a free pass just because you’re working with algorithms. At a non-tech company, you have to abide by the same quality standards that apply to the entire business.”

Among the challenges the company aims to address is how to build products and digital experiences that meet the unique needs of individual brands and business units while offering a unique differentiator. Much of the work involves the data architecture for structuring all of the data the company collects from pet parents who use the apps it develops.

“We’re working with emerging technologies like computer vision and trying to build products with a platform approach to enable us to repurpose these assets in different types of applications,” she says. “My team takes a very component-based approach. I don’t see us building products. Instead, we are building a series of capabilities.”

Digitising pet care

There are around 200 people working in the digital transformation organisation at Mars Petcare. Balingit’s role involves orchestrating initiatives across three core functions: science, data science and software engineering.

“The digital health initiative starts with science; we’re building scientific instruments,” she says. These algorithms are capable of detecting the emerging presence of health conditions in dogs. “I start by partnering with the global R&D [research and development] science function, which includes specialists in oral health, skin health, gut health and healthy ageing.”

The team puts together a specification for the product, such as deciding on the symptoms of a health condition that the software and AI will be able to detect. The data science team then builds the algorithm to detect that health condition.

“In the case of a canine dental check, we’re detecting plaque, tartar and gum irritation. I work with our data science team to build the algorithm – we have to acquire the training data and we have to label it, then we build the computer vision models using Azure developer tools.”
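The flow Balingit describes — acquire training data, have vets label it, then fit a model — can be sketched in miniature. The snippet below is a toy illustration, not Mars Petcare’s actual Azure pipeline: each “image” is a hand-made feature vector and the model is a simple nearest-centroid classifier standing in for a real computer vision stack.

```python
# Toy sketch of the label-then-train flow described above.
# Real systems would train CNNs on photos; here each "image" is an
# illustrative feature vector and the labels are hypothetical.

from statistics import mean

# Step 1: acquire data and have experts label it:
# 1 = visual signs present (plaque/tartar), 0 = looks healthy.
labelled_images = [
    ([0.9, 0.8], 1),
    ([0.8, 0.9], 1),
    ([0.1, 0.2], 0),
    ([0.2, 0.1], 0),
]

# Step 2: "train" by computing one centroid per class.
def train(data):
    centroids = {}
    for label in {lab for _, lab in data}:
        vecs = [v for v, lab in data if lab == label]
        centroids[label] = [mean(dim) for dim in zip(*vecs)]
    return centroids

# Step 3: classify a new image as its nearest class centroid.
def predict(centroids, vec):
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda lab: dist(centroids[lab], vec))

model = train(labelled_images)
print(predict(model, [0.85, 0.75]))  # → 1 (signs detected)
print(predict(model, [0.15, 0.15]))  # → 0 (looks healthy)
```

The point of the sketch is the division of labour the interview describes: domain experts supply the labels, data science supplies the model, and everything downstream consumes its predictions.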

The algorithm is made available via an application programming interface (API). Balingit then works with the software engineering team on the actual product experience. “It’s a truly cross-functional effort,” she says.

The software not only needs to meet the high standards associated with the brand; a high bar is also set for the enterprise architecture, data security and data privacy. With these high standards, Balingit says: “Data science and software engineering can do something really special, which is to scale scientific understanding and put these capabilities into the hands of pet parents around the world through our biggest brands.”

Greenies is a recent example of a brand with an AI tool. “Our use of AI in the Greenies Canine Dental Check tool started with a pet parent insight. We know that 80% of dogs have signs of periodontal disease by the age of three, but 72% of pet parents think that their dog’s oral health is fine,” she says.

The team wanted to address this awareness gap among pet owners using AI to, as Balingit puts it, “make the invisible visible and help people to understand that their dog is experiencing an oral health issue.”

“We’re very explicit about publishing how we train the models. You don’t get a free pass just because you’re working with algorithms”

Kate Balingit, Mars Petcare

The Greenies Canine Dental Check required a computer vision algorithm trained on more than 50,000 images of dogs. “We built an algorithm that was capable of taking a smartphone image to understand if the photograph is of a dog and, if it is, if it’s showing the dog’s mouth and its teeth are visible.” The algorithm then needs to analyse the image to determine whether the tooth has visual signs of oral disease. 
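The staged checks Balingit describes — is this a dog, is its mouth visible, and only then the oral-health analysis — amount to a gated pipeline. Here is a minimal sketch under stated assumptions: the predicates are stubs keyed on a dict, whereas in the real tool each stage would be its own vision model.

```python
# Sketch of the staged dental-check pipeline described above.
# Each gate stops processing early with a user-facing message;
# all field names and messages here are hypothetical.

def dental_check(photo):
    if not photo.get("is_dog"):
        return "retake: no dog detected"
    if not photo.get("teeth_visible"):
        return "retake: mouth and teeth not visible"
    # Final stage: look for visual signs of oral disease.
    signs = photo.get("signs", [])
    if signs:
        return "possible issue: " + ", ".join(signs)
    return "no visual signs detected"

print(dental_check({"is_dog": False}))
print(dental_check({"is_dog": True, "teeth_visible": False}))
print(dental_check({"is_dog": True, "teeth_visible": True,
                    "signs": ["plaque", "gum irritation"]}))
```

Gating cheap checks before the expensive analysis is a common design choice: it gives the user immediate, actionable feedback (“retake the photo”) instead of a low-confidence health result.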

When asked about the success in capturing teeth in a pet dog’s mouth, she says: “We always encourage caution. But when I’ve looked at the data, the average user captures about 10.2 teeth in the photo itself.” So, while it may seem a major undertaking for pet owners to attempt taking smartphone photos of their dog’s mouth with visible teeth, in Balingit’s experience, pet parents are “very capable”.

Another consideration is the level of accuracy. Balingit says: “No algorithm is going to be 100% accurate. A human is not 100% accurate. What’s really important is that we are not building a diagnostic device. Our goal was to build a health-screening instrument that could find visual indicators of an emerging disease.” As such, the 97% accuracy it achieves is good enough.
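The screening-versus-diagnostic distinction can be made concrete with the usual metrics. The confusion matrix below is hypothetical (it is chosen to land at the 97% accuracy quoted, not taken from Mars’s data); what matters for a screening tool is sensitivity — catching emerging cases so the owner sees a vet, who makes the actual diagnosis.

```python
# Hypothetical confusion matrix for a screening tool at ~97% accuracy.
tp, fn = 485, 15    # diseased dogs flagged / missed
tn, fp = 485, 15    # healthy dogs cleared / wrongly flagged

accuracy = (tp + tn) / (tp + tn + fp + fn)
sensitivity = tp / (tp + fn)   # share of real cases the screen catches
precision = tp / (tp + fp)     # share of flags that are real cases

print(f"accuracy={accuracy:.2%} "
      f"sensitivity={sensitivity:.2%} precision={precision:.2%}")
```

A false positive costs one unnecessary vet visit; a false negative means a disease keeps progressing unseen, which is why a screening instrument tolerates imperfect precision more readily than poor sensitivity.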

An approach to business AI

As Balingit notes: “AI is just top of mind for everybody right now.” Like many businesses deploying AI applications, she points out that the past two years have been “a whirlwind”, which means companies such as Mars Petcare need to figure out what they should be doing with AI.

“It’s important to be intentional about what we’re doing, and the key question for me is, ‘What do we at Mars Petcare have that an AI company in Silicon Valley doesn’t have? What are our unique assets and how do we build an AI innovation agenda on top of them?’”

Looking to the future and advances in digital technologies, Balingit believes the world of internet of things (IoT) sensors and AI offers a tantalising opportunity for the business and pet owners alike. While people talking to their pets like Dr Dolittle may seem a bit far-fetched, she says: “Our pets do talk to us with their movements, their facial expressions.” Inevitably, many pet owners may miss these subtle signs, but AI could offer a way to spot these.

Balingit sees an opportunity to use sensor data to help quantify animal behaviour and then apply AI to translate the sensor data into something humans can understand. In a world where digital technologies have made people ever-more disconnected from the real world, tech innovation may one day offer a way for pet parents to have a closer relationship with their furry friends.




Tech

These Sub-$300 Hearing Aids From Lizn Have a Painful Fit


Don’t call them hearing aids. They’re hearpieces, intended as a blurring of the lines between hearing aid and earbuds—or “earpieces” in the parlance of Lizn, a Danish operation.

The company was founded in 2015, and it haltingly developed its launch product through the 2010s, only to scrap it in 2020 when, according to Lizn’s history page, the hearing aid/earbud combo idea didn’t work out. But the company is seemingly nothing if not persistent, and four years later, a new Lizn was born. The revamped Hearpieces finally made it to US shores in the last couple of weeks.

Half Domes

Photograph: Chris Null

Lizn Hearpieces are the company’s only product, and their inspiration from the pro audio world is instantly palpable. Out of the box, these look nothing like any other hearing aids on the market, with a bulbous design that, while self-contained within the ear, is far from unobtrusive—particularly if you opt for the graphite or ruby red color scheme. (I received the relatively innocuous sand-hued devices.)

At 4.58 grams per bud, they’re as heavy as they look; within the in-the-ear space, few other models weigh more, among them the Kingwell Melodia and Apple AirPods Pro 3. The units come with four sets of ear tips in different sizes; the default mediums worked well for me.

The bigger issue isn’t how the tip of the device fits into your ear, though; it’s how the rest of the unit does. Lizn Hearpieces need to be delicately twisted into the ear canal so that one edge of the unit fits snugly behind the tragus, filling the concha. My ears may be tighter than others, but I found this no easy feat, as the device is so large that I really had to work at it to wedge it into place. As you might have guessed, over time, this became rather painful, especially because the unit has no hardware controls. All functions are performed by various combinations of taps on the outside of either of the Hearpieces, and the more I smacked the side of my head, the more uncomfortable things got.


Tech

Two Thinking Machines Lab Cofounders Are Leaving to Rejoin OpenAI



Thinking Machines cofounders Barret Zoph and Luke Metz are leaving the fledgling AI lab and rejoining OpenAI, the ChatGPT-maker announced on Thursday. OpenAI’s CEO of applications, Fidji Simo, shared the news in a memo to staff Thursday afternoon.

The news was first reported on X by technology reporter Kylie Robison, who wrote that Zoph was fired for “unethical conduct.”

A source close to Thinking Machines said that Zoph had shared confidential company information with competitors. WIRED was unable to verify this information with Zoph, who did not immediately respond to WIRED’s request for comment.

Zoph told Thinking Machines CEO Mira Murati on Monday he was considering leaving, then was fired today, according to the memo from Simo. She goes on to write that OpenAI doesn’t share the same concerns about Zoph as Murati.

The personnel shake-up is a major win for OpenAI, which recently lost its VP of research, Jerry Tworek.

Another Thinking Machines Lab staffer, Sam Schoenholz, is also rejoining OpenAI, the source said.

Zoph and Metz left OpenAI in late 2024 to start Thinking Machines with Murati, who had been the ChatGPT-maker’s chief technology officer.

This is a developing story. Please check back for updates.




Tech

Tech Workers Are Condemning ICE Even as Their CEOs Stay Quiet



Since Donald Trump returned to the White House last January, the biggest names in tech have mostly fallen in line with the new regime, attending dinners with officials, heaping praise upon the administration, presenting the president with lavish gifts, and pleading for Trump’s permission to sell their products to China. It’s been mostly business as usual for Silicon Valley over the past year, even as the administration ignored a wide range of constitutional norms and attempted to slap arbitrary fees on everything from chip exports to worker visas for high-skilled immigrants employed by tech firms.

But after an ICE agent shot and killed an unarmed US citizen, Renee Nicole Good, in broad daylight in Minneapolis last week, a number of tech leaders have begun publicly speaking out about the Trump administration’s tactics. This includes prominent researchers at Google and Anthropic, who have denounced the killing as calloused and immoral. The most wealthy and powerful tech CEOs are still staying silent as ICE floods America’s streets, but now some researchers and engineers working for them have chosen to break rank.

More than 150 tech workers have so far signed a petition asking their companies’ CEOs to call the White House, demand that ICE leave US cities, and speak out publicly against the agency’s recent violence. Anne Diemer, a human resources consultant and former Stripe employee who organized the petition, says that workers at Meta, Google, Amazon, OpenAI, TikTok, Spotify, Salesforce, LinkedIn, and Rippling are among those who have signed. The group plans to make the list public once they reach 200 signatories.

“I think so many tech folks have felt like they can’t speak up,” Diemer told WIRED. “I want tech leaders to call the country’s leaders and condemn ICE’s actions, but even if this helps people find their people and take a small part in fighting fascism, then that’s cool, too.”

Nikhil Thorat, an engineer at Anthropic, said in a lengthy post on X that Good’s killing had “stirred something” in him. “A mother was gunned down in the street by ICE, and the government doesn’t even have the decency to perform a scripted condolence,” he wrote. Thorat added that the moral foundation of modern society is “infected, and is festering,” and the country is living through a “cosplay” of Nazi Germany, a time when people also stayed silent out of fear.

Jonathan Frankle, chief AI scientist at Databricks, added a “+1” to Thorat’s post. Shrisha Radhakrishna, chief technology and chief product officer of real estate platform Opendoor, replied that what happened to Good is “not normal. It’s immoral. The speed at which the administration is moving to dehumanize a mother is terrifying.” Other users who identified themselves as employees at OpenAI and Anthropic also responded in support of Thorat.

Shortly after Good was shot, Jeff Dean, an early Google employee and University of Minnesota graduate who is now the chief scientist at Google DeepMind and Google Research, began re-sharing posts with his 400,000 X followers criticizing the Trump administration’s immigration tactics, including one outlining circumstances in which deadly force isn’t justified for police officers interacting with moving vehicles.

He then weighed in himself. “This is completely not okay, and we can’t become numb to repeated instances of illegal and unconstitutional action by government agencies,” Dean wrote in an X post on January 10. “The recent days have been horrific.” He linked to a video of a teenager—identified as a US citizen—being violently arrested at a Target in Richfield, Minnesota.

In response to US Vice President JD Vance’s assertion on X that Good was trying to run over the ICE agent with her vehicle, Aaron Levie, the CEO of the cloud storage company Box, replied, “Why is he shooting after he’s fully out of harm’s way (2nd and 3rd shot)? Why doesn’t he just move away from the vehicle instead of standing in front of it?” He added a screenshot of a Justice Department webpage outlining best practices for law enforcement officers interacting with suspects in moving vehicles.




