Image caption: Ghosts of loved ones. Photo credit: Synthetic Pasts

AI-powered “deathbots” that simulate deceased individuals are turning remembrance into a commercial product, according to new research. The study found these platforms promote a “critically over-simplified understanding of memory, connection, and personhood”.

The research, published in Memory, Mind & Media, found that services in the growing “digital afterlife industry” create interactions that are often “insincere and uncanny”, while obscuring the ethical complexities of their own business models.

Researchers from Cardiff University and King’s College London conducted socio-technical walkthroughs of four deathbot platforms: Almaya, HereAfter, Séance AI, and You, Only Virtual. The authors used their own personal data — including videos, voice notes, and private messages — to create “digital doubles” of themselves. This allowed them to test the services both as users creating an archive and as bereaved individuals interacting with the deceased.

The study identified a central tension between two different models. Platforms like Almaya and HereAfter focus on archival memory, preserving the past as a fixed, retrievable record. Interacting with them feels more like browsing an archive than holding a conversation.

In contrast, services like Séance AI and You, Only Virtual use generative AI to continually reanimate the past. However, the researchers found these generative interactions failed to live up to the platforms’ promises.

Emotional weight of loss

The AI was poor at handling the emotional weight of loss, with one platform responding to a discussion about a death by drowning with: “Oh hun… 😔 it (the death) is not something I’d wish for anyone to dwell on. It’s all a bit foggy now, to be honest. 🌫️ Let’s chat about something a bit cheerier, yeah?”

The study argues these platforms are “commercial products first and foremost”. They transform memory into a “transactional, affective service” designed to monetise remembrance and drive user engagement.

The research, part of the Leverhulme Trust-funded Synthetic Pasts project, concluded that these services redefine the dead as emotionally responsive data agents. In a commentary on their work, the authors wrote: “Our study suggests that while you can talk to the dead with AI, what you hear back reveals more about the technologies and platforms that profit from memory – and about ourselves – than about the ghosts they claim we can talk to.”
