
Deepfaked after death: Some don't want it
Former CNN anchor Jim Acosta's interview with an AI-generated avatar of a Parkland shooting victim has reignited debate on the ethics of creating deepfakes of the dead.

Why it matters: As cheap and free generative AI tools become capable of replicating voices, faces and personalities, some people are adding clauses to their wills to prevent the creation of their digital likeness after they die.

Catch up quick: Acosta, who's now an independent journalist, aired an interview last week with an AI-generated avatar of Joaquin Oliver, one of the teenagers killed at age 17 in the 2018 mass shooting in Parkland, Florida.

Viewers found the video disturbing, exploitative and bad journalism in need of an editor. What it wasn't? Illegal.

Oliver's father, Manuel Oliver, is the executor of his son's estate, so he can use his son's name, image and likeness (NIL) — including creating an AI version of him. This is known as a post-mortem right of publicity, which is recognized in the state of Florida.

How it works: Digital twins are created by uploading photos, videos and writings of a person into a large language model. The models then spit out "twins," which can range from video avatars with audio to text chatbots.

Generative AI can roughly simulate tone and personality and predict how a person might respond. (A rough sketch of how such a text chatbot is assembled appears at the end of this story.)

The big picture: Celebrities have been planning for what happens to their digital NIL after death at least since rapper Tupac Shakur's hologram posthumously "performed" at Coachella in 2012.

But in a world where everyone has an online footprint, it's no longer just a celebrity problem.

Case in point: The viral video of two concertgoers from last month's Coldplay concert was quickly fed into AI tools that used the couple's likeness to create deepfakes.

State of play: It's easy enough to put a clause in your will stating that you don't want to be reanimated by AI.

"It would let families know the decedent's wishes and obligate the executor to carry them out as best they can," said Denise Howell, a technology lawyer and host of the podcast Uneven Distribution on the Hearsay Culture network.

But enforcing that wish could mean expensive lawsuits, especially in states without clear laws on posthumous AI rights.

"Our right of publicity laws weren't written with this situation in mind or designed to deal with it. They vary from state to state, and many states don't have them at all," Howell said.

The other side: Not everyone wants to opt out. Chatbots based on a person's likeness are one way some loved ones grieve.

Joaquin's father says he created the AI version of his son both to deal with his loss and to bring more attention to gun control.

"If the problem that you have is with the AI, then you have the wrong problem. The real problem is that my son was shot eight years ago," Oliver said in an Instagram video.

Follow the money: While few people are planning for posthumous AI rights, many are already building digital versions of themselves to monetize and control now and after death.

AI rights management platform Vermillio now offers this service to everyone for free.

2wai allows celebrities (and soon everyone else) to create their digital avatars on their phones.

But even if you train an AI avatar yourself, it may say things you never would.

"For me, it's a consent issue," Johnni Medina, manager of content and digital engagement at Pace University, told Axios. "I know how I feel about things. I don't know that my loved ones know exactly how I feel about things."
"If I were tragically murdered, I would hate to think that my likeness could be used to advocate for the death penalty for my aggressor," they said.In May, the sister of a man who was killed in a road rage incident used AI to generate a video of her brother giving a victim impact statement. The judge told the family, "I loved that AI. Thank you for that" before sentencing the man to 10.5 years for manslaughter.
