Now AI can bring back the dead



Social media is a ripe virtual graveyard for today’s resurrection men. And yes, there is a Black Mirror episode.


The older your Back Page scribbler gets the more often he is struck by instances of life imitating art, but not in a good way.

Never is this more so than in the realm of social media, where the rule of thumb seems to be: if it can be done, it will be done, regardless of the consequences.

The most recent iteration of this phenomenon comes in the form of the disturbingly named “deathbots”.

Which are?  

Simply speaking, deathbots are computer programs that simulate a dead individual’s conversation, allowing the bereaved to interact with a posthumous synthetic version of their loved one.

Using generative AI systems, the deathbots draw on the text messages, voice messages, emails and social media posts of the deceased person to mimic their conversational behaviour and writing styles.

Now your correspondent is by no means a luddite when it comes to new technology, but this concept raises more red flags than a meeting of the State Council of the People’s Republic of China.

And we are not alone in our misgivings.

New research published by Macquarie University’s department of philosophy examines the impact that deathbots might have on the grieving process, and the ethical implications of this technology.

According to researchers Dr Regina Fabry and Associate Professor Mark Alfano, who studied accounts of human-deathbot interactions over several years, the use of deathbots has an upside.

“From an optimistic perspective, deathbots can be understood as technological resources that can shape and regulate emotional experiences of grief,” Dr Fabry told media.

“Researchers suggest that interactions with a deathbot might allow the bereaved to continue ‘habits of intimacy’ such as conversing, emotional regulation and spending time together.”

On the downside, the deathbots have the potential to create a “zombified” version of the deceased which would run entirely counter to the wishes of that person while they were alive. 

The positive or negative impact of deathbots on grief, Dr Fabry said, also depended on the attitudes of the bereaved towards the conversational possibilities and limitations of this technology.

“Is a bereaved person aware that they are chatting with a deathbot, one that will eventually commit errors? Or does a bereaved person, at least at times, feel as if they are, literally, conversing with the dead?” she said.

There is also the danger that replacing a lost human relationship with a digitally generated one could foster self-deception and delusion in the bereaved.

“To prevent the occurrence of this problem, we recommend the implementation of ‘automated guardrails’ to detect whether a bereaved person becomes overly dependent on their interactions with a deathbot,” Dr Fabry said.

“Furthermore, we recommend that interactions with a deathbot should be supervised by a grief counsellor or therapist.”

Which is all very well and good, but it ignores the reality that deathbots are created by social media companies, which are fundamentally driven by the desire to generate income from their products. Good luck getting those folks to put up any “guardrails” that might get in the way of making a tidy sum of money out of a grieving person.

If you want a deep dive into just how badly the whole deathbot experience could play out, then there’s an episode of the dystopian TV series Black Mirror, called “Be Right Back”, which will confirm all your darkest fears about the overreach of social media into our lives … and deaths.

Send story tips from beyond the grave to penny@medicalrepublic.com.au.
