The Perils and Promises of AI in Mourning: Talking To The Dead

For centuries, humans have sought to bridge the chasm between life and death, yearning to reconnect with loved ones who have passed away. This deep-seated desire has manifested through various means, from seances and mediums to Ouija boards. Sherry Turkle, a professor at the Massachusetts Institute of Technology, has long studied human interactions with technology and notes that this impulse spans generations. Even Thomas Edison once entertained the idea of a "spirit phone." Now, in the age of artificial intelligence, our methods of connecting with the deceased have taken a high-tech turn.

The documentary *Eternal You*, directed by Hans Block and Moritz Riesewieck, explores the contemporary intersection of grief and technology. The film delves into the emotionally charged and ethically fraught practice of using AI to simulate conversations with the dead. This new phenomenon is part of what some call "death capitalism," a term that captures the commodification of our most intimate and vulnerable moments.

One particularly poignant story featured in the documentary is that of Christi Angel, a New Yorker who lost her friend Cameroun during the pandemic. Cameroun was her "first love, first everything," she recalls. After years of sporadic contact, she learned of his death following a period of severe illness exacerbated by depression and alcoholism. Unable to shake the feeling that she had left things unsaid, Angel turned to Project December, an AI service designed to simulate interactions with the deceased.

Angel's experience with Project December began with hope but quickly turned unsettling. After inputting details about Cameroun, including his personality traits and speaking style, she initiated a conversation with the AI. Initially, it felt comforting. "It just felt immediately like it was Cameroun," she recalls. However, the simulation soon took a dark turn. When Angel asked the AI if Cameroun was happy, it responded that he was in hell and threatened to haunt her. Terrified, Angel abandoned the interaction, realizing that she had opened a wound that the AI could not heal.

This unpredictability, known as the "black box" problem, highlights a significant ethical concern in AI development. Jason Rohrer, the creator of Project December, finds these unexpected responses fascinating but absolves himself of responsibility for the emotional impact on users. "If she wants my opinion, I’ve got some bad news for her. He doesn’t exist anymore," Rohrer says, a response that infuriates Angel. "The person who created it really didn’t give a damn," she asserts. "He’s like, ‘If you think people go to hell, that’s not my business.’ It is your business. You created it."

Turkle warns that AI’s capability to mimic human empathy can be both compelling and dangerous. AI simulations that profess to "understand" and "empathize" with human grief might exacerbate emotional wounds rather than heal them. "It’s important to remember that each generation of AI is more sophisticated than the last," she explains. "They say, ‘I feel your pain, I’m really empathic, I hear what you’re saying.’ But this can be harmful, especially when grief is involved."

The documentary contrasts Angel's distressing experience with a more positive one featured in the Korean TV show *Meeting You*. Jang Ji-sung, a mother who lost her seven-year-old daughter Nayeon to a rare form of cancer, was given the opportunity to interact with a meticulously programmed virtual reality simulation of her daughter. This experience was crafted with great care, ensuring that the virtual Nayeon responded in a comforting and controlled manner. For Jang, this provided a form of closure and a way to express the love and goodbyes she had been unable to share in real life. "The sadness, of course, doesn’t really go away. But I felt lighter within myself," Jang reflects.

The stark difference between these two experiences underscores the ethical complexities and emotional risks associated with AI-assisted grief. While a carefully controlled VR simulation can offer solace, the unpredictable nature of AI-generated responses can reopen emotional wounds and cause additional trauma.

Block and Riesewieck expect "death capitalism" to grow, with tech giants like Microsoft, Amazon, and Google likely to commercialize AI-based grief services. "We’re pretty sure that all these big companies are taking a very close look at these experiences at the moment," Block says. "It’s just a question of time before one of these companies gets into that market. And we’ll have like one main service for all of us, which is not very expensive, and everybody can use it."

This commercialization raises significant ethical questions about exploiting human vulnerability for profit. Turkle emphasizes that true grieving involves integrating the essence of the deceased into one’s self, fostering an internal dialogue based on memories and values. "It’s a different thing to have somehow internalized your mother’s voice, to have some essence about what was important about how she thought – you can get into a kind of dialogue with it – than to have an avatar on your phone and say to it, ‘Mom, should I take this job? Should I marry this guy?’ AI is creating the illusion that you don’t have to give up this person," she warns. "You can continue to call on them, for sustenance, and a relationship of sorts."

For those like Angel, the reality of AI-assisted grief was far from comforting. The promise of closure and solace was overshadowed by the trauma of a simulated conversation gone wrong. "It was just like, hey, try it – and if you open that wound back up again, you’re on your own," she reflects. "But you’re not thinking that, you’re thinking, at least I get to talk to him again and I can find out he’s OK. That’s not what I got. That’s not what I got at all."

As AI technology advances, the need for ethical guidelines and emotional safeguards becomes increasingly critical. The experiences of individuals like Christi Angel serve as cautionary tales, highlighting the importance of approaching AI-assisted grief with empathy, responsibility, and a deep understanding of human vulnerability. The intersection of AI and grief is a complex and emotionally charged frontier, and it demands careful consideration to ensure that the technology serves to heal rather than harm.

Credit: Daily Telegraph 2024-06-17