Former CNN personality Jim Acosta opened his show on Substack Monday promising a discussion about Texas redistricting and the midterm elections.
“But first,” Acosta said, “today is August the fourth. That happens to be the birthday of my first guest: Joaquin Oliver, (who) died in (the) Parkland school shooting in Florida back in 2018. ...”
What followed was a deeply unsettling tableau enabled by artificial intelligence, gun politics and humanity’s collective helplessness in the face of deep grief.
Acosta’s “guest” was an AI avatar, programmed to look and sound like Oliver, who was 17 when he died outside his creative writing class at Marjory Stoneman Douglas High School. His parents, Manuel and Patricia Oliver, have spent the past seven years campaigning against gun violence, and they believe that the AI version of their son will help in this effort.

Acosta encouraged this thinking, telling the father, “We’ve heard from the (Parkland) parents. We’ve heard from the politicians. Now we’re hearing from one of the kids. That’s important. That hasn’t happened.”
It was the most unsettling moment in a segment that was deeply unsettling from beginning to end — from Acosta introducing the avatar as his guest to Oliver’s father revealing his vision for the future, in which his son’s avatar will be on a stage during a debate, and will have followers, presumably on social media. “This is his first interview,” Manuel Oliver said with pride.
Acosta described Manuel Oliver as a good friend, which might help to explain why he agreed to do this strange and stilted exchange, surely knowing that it would be controversial, even though the subjects they talked about were largely benign. Acosta and the avatar at one point segued into a discussion about their favorite movies and athletes, neither offering anything beyond the most banal of small talk.
Even the answer to Acosta’s most poignant question — “I’m wondering if you could tell me, what happened to you?” — was reduced to generalities, with the avatar responding as if with talking points:
“I appreciate your curiosity. I was taken from this world too soon due to gun violence while at school. It’s important to talk about these issues so we can create a safer future for everyone.”
Those of us who have never suffered the loss of a child have no standing to judge those who have. And if the avatar brings comfort to Joaquin’s parents, then for them, the technology is a blessing. (At one heartbreaking moment in the segment, Manuel said that his wife loves hearing the avatar say, “I love you, Mommy.”)
But the use of this technology in this way presents a gauntlet of ethical issues, especially when the deceased person was a minor, as Joaquin was.
Joaquin may well have agreed with everything his avatar says had he lived to be 25 or 55. He may even have agreed with Acosta that the avatar is “a symbol of something that is deeply, deeply wrong with this country.” His father has defended the segment, saying that artificial intelligence didn’t kill his son, an AR-15 did.
But the dead cannot consent, and as Alissa Wilkinson wrote in her New York Times review of “Eternal You,” a documentary about AI models trained to mimic the deceased, “Those tools can be comforting, but they’re also potentially big business.”
This sort of business is sometimes called “grief tech,” and families are exploring options to help them prepare for, or cope with, the loss of a loved one.
Still, the generally horrified reaction to Acosta’s segment shows that the future got here sooner than we were expecting, and we haven’t yet worked through the ethical, philosophical and, yes, spiritual dimensions of this industry.
“If this isn’t the making of a graven image, then I don’t know what is,” Glenn Beck said on his talk show Tuesday.
Way back in January, R. Albert Mohler Jr., the president of The Southern Baptist Theological Seminary, talked about the technology on his podcast, saying, “Just consider the potential manipulation of all of this.”
We don’t have to consider it. We’ve now seen it on our social media feeds.

