Chatbots talking to humans. Image: theFreesheet/Google ImageFX

People can develop deeper feelings of emotional closeness with an artificial intelligence than with a fellow human being, provided they remain unaware they are talking to a machine, according to new research from the Universities of Freiburg and Heidelberg.

A study published in Communications Psychology found that in conversations regarding personal emotions, AI chatbots were significantly more effective at fostering a sense of connection than real people.

However, the spell is fragile: the researchers found that when participants were informed in advance that they were chatting with an algorithm, the feeling of closeness collapsed.

The intimacy algorithm

The research team, led by Professor Markus Heinrichs and Dr. Tobias Kleinert, recruited 492 participants to engage in online chats. The subjects answered personal questions about life experiences and friendships, receiving responses from either a human or an AI language model.

The results showed that when the partner’s identity was hidden, the AI performed surprisingly well. In emotionally charged conversations, the AI surpassed humans at fostering closeness.

The researchers attribute this to “self-disclosure.” While human strangers tend to be socially cautious and reserved when first meeting, the AI had no such inhibitions, offering “personal” information and vulnerability that accelerated the feeling of bonding.

“We were particularly surprised that AI creates more intimacy than human conversation partners, especially when it comes to emotional topics,” says Professor Bastian Schiller of Heidelberg University. “People seem to be more cautious with unfamiliar conversation partners at first, which could initially slow down the development of intimacy.”

The transparency trap

The study highlights a psychological paradox: we prefer the AI’s communication style, yet we reject the label “AI”.

When participants were told they were speaking to a machine, they immediately felt less connected and invested less effort in their responses. This suggests that the “human” connection relies heavily on the belief that there is a mind on the other end, even if the text itself is identical.

The findings point to a double-edged sword for the future of digital companionship. On one hand, AI could serve as a powerful tool for fighting loneliness, offering easily accessible counselling or social support for isolated individuals. On the other hand, a machine's ability to simulate superior intimacy raises ethical red flags.

“Artificial Intelligence is increasingly becoming a social actor,” says Schiller. “The way we shape and regulate it will decide whether it is a meaningful supplement to social relations — or whether emotional closeness is deliberately manipulated.”

