When a journalist uses AI to interview a dead child, isn’t it time to ask what the limits should be? | Gaby Hinsliff

Joaquin Oliver was 17 years old when he was shot in a corridor of his high school. An older teenager, expelled a few months earlier, had opened fire with a high-powered rifle on Valentine’s Day, in what became the deadliest high school shooting in America. Seven years later, Joaquin says he thinks it is important to talk about what happened that day in Parkland, Florida, “so that we can create a safer future for everyone”.

But unfortunately, what happened to Joaquin that day is that he died. The strangely metallic voice speaking to the former CNN journalist Jim Acosta in an interview on Substack this week was in fact that of a digital ghost: an AI, trained on the teenager’s old social media posts at the request of his parents, who use it to bolster their campaign for stricter gun controls. Like many bereaved families, they have told their child’s story many times over. No wonder they are now desperately pulling every possible lever, wondering what it takes to make Washington hear the children who have died.

But his father, Manuel, admits they also simply wanted to hear their son’s voice again. His wife, Patricia, spends hours asking the AI questions, listening to it say: “I love you, Mom.”

Nobody in their right mind would ever judge a bereaved parent. If it is a comfort to keep the lost child’s room as a shrine, to talk to their gravestone, to sleep with a T-shirt that still smells faintly of them, then it is nobody else’s business. People hold on to what they can. After 9/11, families played the answerphone messages left by loved ones calling home to say goodbye from the burning towers and hijacked planes until the tapes physically wore out. I have a friend who still rereads WhatsApp exchanges with her late sister, and another who sometimes texts her late father with snippets of family news: she knows he isn’t there, of course, but isn’t quite ready to end the conversation. Some people even pay mediums to commune, in dubious platitudes, with the dead. But it is precisely because letting go is so hard that grief is vulnerable to exploitation. And there could soon be big business in digitally raising the dead.

As with the mawkish AI-generated video Rod Stewart played on stage this week, showing the late Ozzy Osbourne greeting various dead music legends, that could mean little more than glorified memes. Or it could serve temporary purposes, such as the AI avatar recently created by the family of a shooting victim in Arizona to address the judge at the shooter’s sentencing. But over time, it could become something more profoundly challenging to our ideas of selfhood and mortality. What if it were possible to create a permanent AI replica of someone who has died, perhaps in robot form, and carry on a conversation with them for ever?

An AI image of Ozzy Osbourne and Tina Turner shown during a Rod Stewart concert in the US, August 2025. Photograph: iamsloanesteel/Instagram

Resurrection is a divine power, not one to be handed over lightly to a tech bro with a messiah complex. But while the legal rights of the living not to have their identities stolen for use in AI deepfakes are becoming more established, the rights of the dead remain murky.

Reputation dies with us – the dead cannot be defamed – while DNA is posthumously protected. (The 1996 birth of Dolly the sheep, a genetic clone copied from a single cell, triggered global bans on human cloning.) The law governs the respectful disposal of human tissue, but those are not the remains on which AI will be trained: it is the private voicemails and messages and images that capture what matters about a person. When my father died, I personally never felt he was really in the coffin. He was far more obviously present in the boxes of his old letters, the garden he planted, the recordings of his voice. But everyone grieves differently. What happens if one half of a family wants Mum digitally resurrected, and the other half doesn’t want to live with ghosts?

The fact that the AI Joaquin Oliver can never grow up – that he will be 17 for ever, trapped in the amber of his teenage social media persona – is ultimately his killer’s fault, not his family’s. Manuel Oliver says he knows perfectly well that the avatar is not really his son, and that he is not trying to bring him back. For him, it seems more a natural extension of the way the family’s campaigning already invokes the story of Joaquin’s life. Yet there is something unsettling about the plan to give the AI access to a social media account, to upload videos and gain followers. What if it starts to hallucinate, or holds forth on subjects where it cannot know what the real Joaquin would have thought?

Though for now there is a telltale glitchiness to AI avatars, as the technology improves it may become increasingly difficult to distinguish them from real humans online. Perhaps it won’t be long before companies, or even government departments, that already use chatbots to handle customer queries start wondering whether they could deploy avatar spokespeople to field questions from journalists. Acosta, a former White House correspondent, probably should have known better than to muddy already murky waters in a post-truth world by agreeing to interview someone who doesn’t technically exist. But for now, perhaps the most obvious risk is conspiracy theorists citing this interview as “proof” that any story challenging their beliefs could be a hoax – the same deranged lie infamously spread by the Infowars host Alex Jones about the Sandy Hook school shootings.

The professional dilemmas here, however, are not just for journalists. As AI evolves, we will all increasingly be living alongside synthetic versions of ourselves. It won’t just be a relatively primitive Alexa in your kitchen or a chatbot on your laptop – though there are already stories of people anthropomorphising AI, or even falling in love with ChatGPT – but something far more finely attuned to human emotions. When one in 10 British adults tells researchers they have no close friends, of course there will be a market for AI companions, just as there is today for getting a cat or scrolling through strangers on TikTok.

Perhaps, as a society, we will eventually decide we are comfortable with technology meeting needs that other humans have sadly failed to meet. But there is a big difference between conjuring up a comforting generic presence for the lonely and raising the dead to order, one lost loved one at a time. There is a time to be born and a time to die, according to the verse so often read at funerals. How will it change us as a species, when we no longer know which is which?
