Stephenie Lucas Oney is 75 years old, but she still turns to her father for advice. How did he deal with racism, she wonders. How did he manage to succeed despite the adversity he faced? The answers are rooted in William Lucas’s experience as a Black man from Harlem, in New York, who made a living as a police officer, FBI agent, and judge.

But Oney doesn’t receive the advice in person. Her father has been dead for over a year. Instead, she listens to the answers, spoken in her father’s voice, on her phone through HereAfter AI, an artificial intelligence-based application that generates responses based on hours of interviews conducted with him before his death in May 2022.

His voice comforts her, but she says she created the profile mainly for her four children and eight grandchildren. “I want the kids to hear his voice talking about all those things,” said Oney, an endocrinologist, from her home in Grosse Pointe, Michigan, “and not my voice trying to paraphrase, but to hear it from his point of view, in his time and from his perspective.”
Some people are turning to AI technology as a way to communicate with the deceased, but its use as part of the grieving process has raised ethical questions and unsettled some of those who have experimented with it.
HereAfter AI was introduced in 2019, two years after the debut of StoryFile, which produces interactive videos in which subjects appear to make eye contact, breathe, and blink as they answer questions. Both generate responses based on users’ answers to prompts like “tell me about your childhood” and “what is the biggest challenge you have faced?”
The technology’s appeal doesn’t surprise Mark Sample, a professor of digital studies at Davidson College, who teaches a course called “Death in the Digital Age.” “Whenever a new kind of technology arises, there’s always a need to use it to contact the dead,” Sample explained. He pointed to Thomas Edison’s failed attempt to invent a “spirit phone.”
‘My best friend, he was there’
StoryFile offers a “high-fidelity” version in which a historian interviews the subject in a studio, but there is also a version that requires only a laptop and a webcam to get started. Stephen Smith, a co-founder of the company, had his mother, Marina Smith, a Holocaust educator, try it out. Her StoryFile avatar answered questions at her funeral in July.
According to StoryFile, nearly 5,000 people have created profiles. Among them was the actor Ed Asner, who was interviewed eight weeks before his death in 2021.
The company sent Asner’s StoryFile to his son Matt Asner, who was amazed to see his father looking at him and appearing to answer his questions. “I was very surprised,” Matt Asner said. “I found it incredible that I could have that interaction with my father, one that was relevant and meaningful, and it was his personality. This man I missed so much, my best friend, he was there.” He played the file at his father’s funeral. Some people were moved, he said, but others felt uncomfortable. “There were people who found it morbid and were scared,” he said. “I don’t share that opinion, but I can understand them saying that.”
‘A little difficult to watch’
Lynne Nieto understands as well. She and her husband, Augie, the founder of Life Fitness, a gym equipment manufacturer, created a StoryFile before his death in February from amyotrophic lateral sclerosis (ALS). They thought they could use it on the website of Augie’s Quest, the nonprofit they founded to raise funds for ALS research. Maybe their grandchildren would want to see it someday.
Nieto saw her husband’s file for the first time about six months after his death. “I’m not going to lie, it was a little difficult to watch,” she said, adding that it reminded her of their Saturday morning talks and was a somewhat intense experience.
Those feelings are not uncommon. These products force consumers to confront the very thing they are wired to avoid thinking about: mortality.
A matter of consent and perspective
Like other AI innovations, chatbots created in the likeness of someone who has died raise ethical questions. Ultimately, it is a matter of consent, said Alex Connock, a professor at the Saïd Business School at the University of Oxford and author of The Media Business and Artificial Intelligence. “Like all ethical lines in AI, it will come down to permission,” he said. “If it’s done knowingly and voluntarily, I think most of the ethical issues can be resolved quite easily.”
The effects on survivors are not as clear. David Spiegel, associate chair of psychiatry and behavioral sciences at the Stanford School of Medicine, said programs like StoryFile and HereAfter AI could help people process grief, like looking through an old photo album. “The crucial thing is to maintain a realistic perspective on what you’re looking at: that it’s not that the person is still alive, communicating with you,” he said, “but you’re seeing what they left behind.”