What just happened? We've seen stories in the past of people using artificial intelligence to have conversations with deceased loved ones – or at least the system's interpretation of their personality. Now, AI technology has been used so a man who was killed in a road rage incident in 2021 could address his killer in court.
Army veteran Christopher Pelkey, 37, was shot dead by Gabriel Horcasitas at a red light in Chandler, Arizona, in 2021. Pelkey had left his vehicle and was walking back toward Horcasitas' car when he was shot.
In what is believed to be the first use of AI to deliver a victim statement, a lifelike simulacrum of the deeply religious Pelkey addressed the man who killed him in an Arizona court.
"To Gabriel Horcasitas, the man who shot me, it is a shame we encountered each other that day in those circumstances," said Pelkey. "In another life, we probably could have been friends."
"I believe in forgiveness, and a God who forgives. I always have, and I still do."
Stacey Wales, Pelkey's sister, came up with the idea to use AI in this way as she collected victim impact statements and prepared her own.
"We received 49 letters that the judge was able to read before walking into sentencing that day. But there was one missing piece. There was one voice that was not in those letters," she said. "All I kept coming back to was, what would Chris say?"
Wales poses with the photo of her brother on which the AI-generated video is based (credit: Fox 10)
Unlike other instances of generative AI being used to speak to deceased individuals, Wales wrote the script that her brother delivered. The technology was used to create a video of an older version of Pelkey, based on a photograph provided by the family, and put the words into his mouth, making this more like a deepfake – albeit one created for a good cause.
This was one of the rare cases where a judge welcomed the use of AI in a courtroom. Judge Todd Lang said, "I loved that AI, thank you for that. As angry as you are, as justifiably angry as the family is, I heard the forgiveness." Pelkey's brother John was equally pleased, saying that seeing his brother's face made him feel "waves of healing."
Lang sentenced Horcasitas to 10-and-a-half years in prison on manslaughter charges.
Most of the instances of AI being used in courtrooms haven't gone well. Back in 2023, what was set to be the first case of an AI "robot lawyer" used in a court of law never materialized after the CEO behind it was threatened with jail time.
There have also been several instances of human lawyers using generative AI to file briefs containing nonexistent cases. In June 2023, two lawyers and their law firm were fined $5,000 by a district judge in Manhattan for citing fake legal research generated by ChatGPT, and a similar case this year led to a $15,000 fine for the lawyer involved.