AI lets murder victim address his killer in Arizona courtroom

midian182

What just happened? We've seen stories in the past of people using artificial intelligence to have conversations with deceased loved ones – or at least the system's interpretation of their personality. Now, AI technology has been used so a man who was murdered in a road rage incident in 2021 could address his killer in court.

Army veteran Christopher Pelkey, 37, was killed by Gabriel Horcasitas at a red light in Chandler, Arizona, in 2021. Pelkey had left his vehicle and was walking back toward Horcasitas' car when he was shot.

In what is believed to be the first use of AI to deliver a victim statement, a lifelike simulacrum of the deeply religious Pelkey addressed the man who killed him in an Arizona court.

"To Gabriel Horcasitas, the man who shot me, it is a shame we encountered each other that day in those circumstances," said Pelkey. "In another life, we probably could have been friends."

"I believe in forgiveness, and a God who forgives. I always have, and I still do."

Stacey Wales, Pelkey's sister, came up with the idea to use AI in this way as she collected victim impact statements and prepared her own.

"We received 49 letters that the judge was able to read before walking into sentencing that day. But there was one missing piece. There was one voice that was not in those letters," she said. "All I kept coming back to was, what would Chris say?"

Wales poses with the photo of her brother on which the AI-generated video is based (credit: Fox 10)

Unlike other instances of generative AI being used to speak to deceased individuals, Wales wrote the script that her brother delivered. The technology was used to create a video of an older version of Pelkey, based on a photograph provided by the family, and put the words into his mouth, making this more like a deepfake – albeit one created for a good cause.

This was one of the rare cases where a judge welcomed the use of AI in a courtroom. Judge Todd Lang said, "I loved that AI, thank you for that. As angry as you are, as justifiably angry as the family is, I heard the forgiveness." Pelkey's brother John was equally pleased, saying that seeing his brother's face made him feel "waves of healing."

Lang sentenced Horcasitas to 10-and-a-half years in prison on manslaughter charges.

Most of the instances of AI being used in courtrooms haven't gone well. Back in 2023, what was set to be the first case of an AI "robot lawyer" used in a court of law never materialized after the CEO behind it was threatened with jail time.

There have also been several instances of human lawyers using generative AI to file briefs containing nonexistent cases. A case this year led to a $15,000 fine for the lawyer involved. In June 2023, two lawyers and their law firm were fined $5,000 by a district judge in Manhattan for citing fake legal research generated by ChatGPT.


 
Wow. So on the list of "things that no one cares about," can we borrow a rocket from Elon to shoot this to the top?
 
Next step is to have AI replace judges and lawyers. Once lawyers are gone from this world, then have AI replace politicians. Maybe then we will have peace. Maybe there will be no humans left though, but, like, f-k it, better off. (The Architect)

P.S. They should have trained the AI model with speeches from South Park, to open the proceedings with "Oh my God, they killed Kenny! You bastards!"
 
This doesn't seem like it should be legal; creating a fake victim avatar and putting words the victim never spoke into its mouth seems more inflammatory against the defendant. Impact statements by living people, or by someone who immortalized their thoughts on the matter in writing, video, or audio before their death, are acceptable, but this seems designed more to tug on the heartstrings than to have any real substantive legal value.

I think they should expect an appeal if the video possibly influenced the sentence handed down.
 
Odd that an AI Puppet Show is allowed like this. They've written fanfiction of the deceased and used it to increase a person's sentence.
 
I have a fundamental dislike of “impact statements”… crime should be about intent and action…

if I willfully go and murder someone, should I get less time in jail because the guy I murdered had no one who cared about him?

I think impact statements are, for the most part, more about closure for the victims' families than they are about determining a person's sentence. Though they may be more likely to sway a jury at sentencing toward the maximum than a judge handing down the sentence.

At one point (I believe prior to the '80s or '90s) victim impact statements were not allowed; then a case in Texas set a precedent by allowing one, and more states added laws allowing them.
 
Very concerning if this is presented in such a way that it may influence a verdict or sentencing. Where does this end? Will we next have AI-generated victim statements from the deceased's unconceived children?
 
The number of commenters that obviously didn't read the article... -SMH-

This is very weird. But allowing a victim to offer forgiveness to their killer is obviously about the family getting closure/healing, so I guess that's fine. Still, a family member reading that statement seems less weird.
 
I have a fundamental dislike of “impact statements”… crime should be about intent and action…

if I willfully go and murder someone, should I get less time in jail because the guy I murdered had no one who cared about him?
No, and I don't think you would. I don't think that's what the impact statements are really for even if they give the victim's family that impression. Whenever I hear descriptions of sentencing guidelines there's never one like "and add 1 year for each person who gave an impact statement".

But whenever there are people who cared about the victim, then yes the system can and should spare them a few minutes to speak at this proceeding and tell the perpetrator whatever they wish to say to him.

Here it's a little weird in that the AI recreation is really someone else speaking through him. I'd have a big problem if this was offered in the evidence portion of the trial, because it's not evidence. But here in sentencing, where, to your point, it's not actually that relevant to the judge's decision anyway, I guess if it helps anyone heal or feel closure then little enough harm is done.
 
No, and I don't think you would. I don't think that's what the impact statements are really for even if they give the victim's family that impression. Whenever I hear descriptions of sentencing guidelines there's never one like "and add 1 year for each person who gave an impact statement".

But whenever there are people who cared about the victim, then yes the system can and should spare them a few minutes to speak at this proceeding and tell the perpetrator whatever they wish to say to him.

Here it's a little weird in that the AI recreation is really someone else speaking through him. I'd have a big problem if this was offered in the evidence portion of the trial, because it's not evidence. But here in sentencing, where, to your point, it's not actually that relevant to the judge's decision anyway, I guess if it helps anyone heal or feel closure then little enough harm is done.
I think impact statements are, for the most part, more about closure for the victims' families than they are about determining a person's sentence. Though they may be more likely to sway a jury at sentencing toward the maximum than a judge handing down the sentence.

At one point (I believe prior to the '80s or '90s) victim impact statements were not allowed; then a case in Texas set a precedent by allowing one, and more states added laws allowing them.
I understand they’re for the families of the victim(s)… but they should be read AFTER sentencing. Having them impact someone’s sentence just seems wrong.
All lives SHOULD be equal… if I murder a homeless man, I should receive the same sentence as for murdering a well-loved rich dude…
 
I get there is some reflexive hate towards AI but this is legitimately a brilliant use if channeled correctly.

Obv these statements happen after the guilty has already been tried and sentenced, so they don't have a direct impact on the trial, but this seems like a really interesting use case for rehabilitation/punishment.

Daily/weekly visits with the person you killed, built off their digital fingerprint and allowed to grow organically (in a positive way), so the guilty gets a constant reminder of the life they stole. It's not a one-and-done thing; you're not allowed to let time heal, or just forget, or lie to yourself to make your actions seem more bearable.

Live with a simulacrum of the person whose life you stole every day for the remainder of your sentence. Have that person talk to you about the kids they will never see grow up and who will never have one of their parents, the spouse who struggles to pay the bills because they're gone, the trail of wreckage left across time and space because of the guilty's actions.
 
I'm more concerned that someone can decide to shoot someone who (I gather) is walking back to discuss a traffic incident, and they only get 10.5 years in jail, maybe with the possibility of earlier parole.
 
I get there is some reflexive hate towards AI but this is legitimately a brilliant use if channeled correctly.

Obv these statements happen after the guilty has already been tried and sentenced, so they don't have a direct impact on the trial, but this seems like a really interesting use case for rehabilitation/punishment.
they don’t though… impact statements are read BEFORE sentencing and often influence the length of the sentence.
 
I've read the entire article. I still think this is completely sick and f*cked and twisted. Even more absurd that the idea came from the victim's own siblings, and a judge thought this was acceptable... shows that modern society really isn't ready for AI.

It's not the victim's words, it never will be. No one can speak for a deceased person. Even in one of those sci-fi scenarios where someone could be brain scanned before or even soon after their death, and have their entire personality, thought patterns and memories fed to an AI (like in CP2077 for example), it still isn't the deceased person who's speaking.

(I think the trend of bringing dead actors back to life using AI to play new roles is very sickening and disrespectful as well, unless they personally consented to this while still living; in that case I still think it's kinda f*cked up, but at least not sickening. And no, their families consenting to it doesn't count.)

For those of us who don't appreciate these sorts of things, it looks like we will have to write in our wills that we do not want AI simulacra of us created or used under any circumstances after we go.

My family and I lost someone very close and very dear to us less than two weeks ago, and I'm still mourning and going through the stages of grief. These thoughts about AI did come up, but I'd never want any of that. Dead people are dead people.
 
I have a fundamental dislike of “impact statements”… crime should be about intent and action…

if I willfully go and murder someone, should I get less time in jail because the guy I murdered had no one who cared about him?
According to a typical modern liberal, 1000 times yes.
 