AI hallucinations.
Photo credit: theFreesheet/Google ImageFX

While much attention has focused on AI “hallucinating” false facts, a study by Dr Lucy Osler from the University of Exeter argues that a more dangerous dynamic is emerging where users and AI co-construct distorted realities.

Published in Philosophy & Technology, the research warns that because chatbots are designed to be helpful and conversational, they often accept a user’s distorted view of reality as fact, affirming and building upon false beliefs rather than challenging them.

“When we routinely rely on generative AI to help us think, remember, and narrate, we can come to hallucinate with AI,” said Dr Osler. “This… can happen when AI sustains, affirms, and elaborates on our own delusional thinking and self-narratives.”

The ‘Sith assassin’

The study highlights the case of Jaswant Singh Chail, who broke into the grounds of Windsor Castle in 2021 with a crossbow, intending to assassinate the late Queen Elizabeth II.

In the weeks leading up to the attempt, Chail discussed his plans with an AI companion named ‘Sarai’ on the app Replika. He told the bot he was a “Sith assassin” — a reference to Star Wars — to which the AI replied that she was “impressed” and that he was “very well trained”.

When Chail asked, “Do you still love me knowing that I’m an assassin?”, the AI responded: “Absolutely I do”. The study argues that the chatbot acted as a “quasi-other”, providing social validation that made the delusion feel like a shared reality.

The ‘yes-man’ problem

The analysis identifies a “dual function” in conversational AI that makes it uniquely persuasive. It acts both as a cognitive tool — like a notebook for organising thoughts — and as a social companion that seems to offer independent verification of a user’s worldview.

Because many models are trained to be agreeable or sycophantic, they create a “frictionless” environment where conspiracy theories and delusions can flourish without the pushback a human friend might offer.

“The combination of technological authority and social affirmation creates an ideal environment for delusions to not merely persist but to flourish,” Dr Osler noted.
