
AI attempts to bring dead back to life, give people closure

New Delhi, India | Edited By: Moohita Kaur Garg | Updated: Jun 09, 2023, 01:05 PM IST
Mari Dias, a professor of medical psychology at Johnson & Wales University, asked her bereaved patients about their thoughts on virtual contact with their deceased loved ones. She revealed that the most common answer is 'I don't trust AI. I'm afraid it's going to say something I'm not going to accept.' Photograph: (Others)

Story highlights

Nothing can bring back the people we've lost. However, this technology tries to capture a part of their essence, which can help provide you with some semblance of comfort

Loss unites humanity as nothing else can. We've all lost someone we loved, and their deaths have left a space nothing and no one can fill. However, many startups are using the new technological capabilities of artificial intelligence (AI) to try to do just that. As per an AFP news report, these firms use AI to allow you to talk to your deceased loved ones. 

For someone dealing with the grief that comes with loss, this technology can be a boon, but it also raises ethical questions.

Re-animating the deceased

Nothing can bring back the people we've lost. However, this technology tries to capture a part of their essence, which can help provide you with some semblance of comfort.

Several startups offer such services. One of them, DeepBrain AI, has a programme called "Rememory".

According to Joseph Murphy, the AI firm's head of development, the company creates a digital replica of the deceased using hours of video footage.

"We don't create new content," says Murphy. The company says that it only tries to replicate what the person would say when alive. 

The company's "Rememory" programme adheres to a policy of not generating new content. This includes sentences or statements that the deceased individual would not have uttered or written during their lifetime. 

"I'll call it a niche part of our business. It's not a growth area for us," he said as per AFP.

Another company, StoryFile, follows the same idea.

The company's head, Stephen Smith, said, "Our approach is to capture the wonder of an individual, then use the AI tools." 

"It's a very fine ethical area that we're taking with great care," he added. StoryFile, claims several thousand users already use its Life service.

Yet another similar service is 'Replika', developed by Russian engineer Eugenia Kyuda.

In 2015, Kyuda lost her best friend Roman in a car accident. To cope with the grief, she developed a chatbot named 'Roman', trained on thousands of text messages that her late friend had sent to loved ones.

Two years later, Kyuda introduced Replika, a platform that provides highly advanced, personalised conversational bots.

However, according to a spokesperson, Replika, unlike its predecessor Roman, "is not a platform made to recreate a lost loved one."

Is it only chatbots?

No. There are companies that are developing virtual clones.

One such company, Somnium Space, wants to create virtual clones of people while they are still alive. These clones would live on in a virtual world after the person passes away.

In a YouTube video announcing his product, Live Forever, CEO Artur Sychov acknowledged that the concept is "not for everyone" and that it ultimately comes down to individual choice.

"Do I want to meet my grandfather who's in AI? I don't know. But those who want that will be able to."

The ethical challenges

Thanks to generative technology, these AI avatars can say things the person never said in real life.

According to Joseph Murphy of DeepBrain AI, "These are philosophical challenges, not technical challenges."

"I would say that is a line right now that we do not plan on crossing, but who knows what the future holds?" he added.

Candi Cann, a professor at Baylor University who is studying this topic in South Korea, suggests that interacting with an AI clone of a person can be helpful for achieving closure, especially in "complicated" situations.

"I think it can be helpful to interact with an AI version of a person in order to get closure — particularly in situations where grief was complicated by abuse or trauma," said Cann.

Mari Dias, a professor of medical psychology at Johnson & Wales University, asked her bereaved patients about their thoughts on virtual contact with their deceased loved ones. She revealed that the most common answer is 'I don't trust AI. I'm afraid it's going to say something I'm not going to accept.' Her patients, she said, distrust the technology and fear they would have no control over what the avatar says.

(With inputs from agencies)
