An AI-generated video of a dead victim was used in an Arizona court case, raising questions about how the tech’s use might spread
Christopher Pelkey's relatives created the video to make it appear as if he was addressing the court. Pennsylvania court officials said they knew of no comparable example in the state's courts.

Inside an Arizona courtroom earlier this month, a judge allowed what may have been a nationwide first: live testimony — of a sort — from a dead man.
During a sentencing hearing for a man convicted of manslaughter in the shooting death of Christopher Pelkey, the victim’s relatives played a video for the judge that was generated by artificial intelligence and designed to make it appear as if Pelkey was addressing the court — and, at times, speaking directly to his convicted shooter.
“It is a shame we encountered each other that day in those circumstances,” said Pelkey’s avatar, which recited a script written by Pelkey’s sister and used an image of his face and a voice designed to mimic his own. “In another life, we probably could have been friends.”
Pelkey, 37, a U.S. Army veteran, was killed in a road-rage attack in November 2021. His sister, Stacey Wales, said she wanted to deliver a victim-impact statement at the shooter’s sentencing that would allow the judge to learn about her brother’s humanity.
She and her husband, who works in tech, created the video, which she said felt like “a true representation of the man we knew.”
“We had one goal, which was to humanize Chris and to make a judge feel, and I believe that we were successful in doing so,” she told USA Today.
Legal observers say it was likely the first time that AI had been used in such a fashion inside a United States courtroom.
Now there are questions about how or whether its use may grow.
In Pennsylvania, court officials said they knew of no comparable use of AI in any state cases. Last year, the Pennsylvania and Philadelphia Bar Associations issued a joint formal opinion on how lawyers can use AI ethically in the courts, though it did not include guidance addressing the specific application in Pelkey’s case.
At least one judge in Philadelphia’s federal courthouse issued a standing order in 2023 requiring attorneys to disclose any time they use AI in cases before him. And a federal judicial panel earlier this month approved a measure to potentially regulate the use of AI-generated evidence at trial.
Jules Epstein, a professor at Temple University’s Beasley School of Law, said victims or their relatives have a right in Pennsylvania’s state courts to provide written statements or testimony before sentencing. And judges have the discretion to permit those witnesses to bring supplementary materials to the proceedings, such as photos or other keepsakes — meaning an AI-generated video could be viewed as an extension of that type of material.
But Epstein said judges also have the ability to exclude such videos and instead ask witnesses to simply tell the court what victims might have said if they could have been present.
If a judge does decide to allow a video to be played, Epstein said, “the assumption should be that a judge who has experience can separate the emotionality” from the message it contains, and not let it or any other materials interfere with the duty to impose a fair sentence.
In addition, Epstein said, the analysis of whether to allow such a video to be played “would be very different if this were any kind of proceeding in front of a jury.”
In Pelkey’s case, an attorney for the defendant, Gabriel Horcasitas, quickly appealed the 10½-year sentence the judge imposed May 1, saying the use of the video “just felt wrong on many levels,” according to an ABC affiliate in Arizona.
“This may be a situation where they just took it too far,” the attorney, Jason Lamm, told the New York Times, “and an appellate court may well determine that the court’s reliance on the A.I. video could constitute reversible error and require a resentencing.”