
Your chatbot doesn’t love you: The ‘illusion’ of social AI

Published on 12 November 2025

Every day, millions of people talk to chatbots and AI assistants such as ChatGPT, Replika and Gemini, but what kind of “relationships” are we really forming with them?

In a special issue of the journal New Media & Society, Dr. Iliana Depounti (Loughborough University) and Associate Professor Simone Natale (University of Turin) explore the rise of "artificial sociality": technologies that simulate social and emotional behaviors without actually possessing them.

Their article, “Decoding Artificial Sociality: Technologies, Dynamics, Implications,” reveals a number of issues associated with the rise of Large Language Models (LLMs) and AI chatbots.

It argues that the illusion of friendship or understanding created by AI is being deliberately cultivated by technology companies to increase user engagement, as in Spotify's "AI DJ", which speaks with a friendly human voice, and Replika's "virtual companion" chatbots.

Dr. Depounti said, “Companion generative AI bots such as Replika or Character AI exemplify artificial sociality technologies.

“They are created to foster emotional projection, offering users intimacy and companionship through features like avatars, role-playing, customization and gamification—all with monetary benefits for the companies that design them.

“ChatGPT, too, uses artificial sociality techniques, from referring to itself as ‘I’ to adopting tones of authority, empathy or expertise.

“Though these systems simulate sociality rather than recreate it, their power lies in that simulation—in their ability to engage, persuade and emotionally move millions of users worldwide, raising deep ethical questions.”

The study shows how social cues are engineered into products to keep people interacting longer.

Other issues include:

  • Machines only imitate social behavior, but users still project feelings, trust and empathy onto them.
  • User data and emotional labor are exploited to train and "personalize" AI systems, raising ethical concerns about hidden human work and massive data-center energy use.
  • Bias and stereotypes in AI systems mirror social inequalities, shaping how gender, class and race are represented in digital conversations.
  • Users adapt to AI “companions” through what researchers call “re-domestication”—renegotiating relationships every time a chatbot’s personality or behavior changes.
  • The line between authenticity and deception is becoming blurred as AI personalities are marketed as “friends,” “co-workers” or even “influencers.”

Dr. Natale said, “Artificial sociality is the new frontier of human–machine communication in our interactions with generative AI technologies.

“These systems don’t feel, but they are designed to make us feel, and that emotional projection has profound social, economic and ethical consequences. Artificial sociality technologies invite and encourage these projections.”

Behind these apparently effortless conversations, the researchers warn, lies a vast infrastructure of human and environmental cost.

AI models rely on huge datasets drawn from people’s online interactions and often from their conversations with the machines themselves.

This data is then used to “train” chatbots to sound more human—sometimes with users unknowingly performing unpaid emotional or linguistic labor.

At the same time, the servers powering generative AI consume enormous amounts of electricity and water.

The authors highlight a $500 billion investment by major tech firms in new data centers to meet AI demand, describing it as part of an "extractive" system that turns human interaction into corporate assets.

More information:
Iliana Depounti et al., Decoding Artificial Sociality: Technologies, Dynamics, Implications, New Media & Society (2025). DOI: 10.1177/14614448251359217

Citation:
Your chatbot doesn’t love you: The ‘illusion’ of social AI (2025, November 12)
retrieved 12 November 2025
from https://techxplore.com/news/2025-11-chatbot-doesnt-illusion-social-ai.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no
part may be reproduced without the written permission. The content is provided for information purposes only.




