Judge Allows AI-Generated Victim Statement in Arizona Murder Case
In a courtroom first that could reshape how technology intersects with justice, an Arizona judge allowed a murder victim to deliver his own victim impact statement — four years after his death. The message came not from the grave, but through an artificial intelligence–generated video that recreated the likeness and voice of Christopher Pelkey, a 37-year-old military veteran killed in a 2021 road rage shooting.
A Road Rage Confrontation Turns Deadly
Pelkey, described by friends and family as a loving father and forgiving soul, was shot and killed after approaching the car of Gabriel Paul Horcasitas during a heated traffic altercation in Chandler, Arizona. Horcasitas fired his weapon, later claiming he acted in fear.
After a lengthy trial, Horcasitas was convicted of manslaughter. At his May 2025 sentencing in Phoenix, the court prepared to hear dozens of victim impact statements from Pelkey’s friends and family. But one statement stood apart: Pelkey himself, digitally recreated.
The AI Video Message
Pelkey’s sister, Stacey Wales, and her husband worked with a friend skilled in AI avatar creation to build a moving, speaking likeness from a single photo and video fragments of Pelkey. The AI-generated Pelkey appeared on screen, addressing Horcasitas directly.
He lamented the tragic way they had met, spoke of his deep belief in forgiveness, and urged others to cherish life and love one another. The words, Wales explained, reflected her brother’s true character — a man who would never have wanted bitterness to define his legacy.
“I couldn’t find the words,” Wales said. “So I let Chris speak for himself.”

Judge’s Reaction and Sentencing
Maricopa County Superior Court Judge Todd Lang allowed the video to be played before handing down a 10.5-year prison sentence. The family had requested the maximum penalty, but Lang noted that the AI statement seemed to align with the person described in nearly 50 letters submitted on Pelkey’s behalf.
“Even though that’s what you wanted, you allowed Chris to speak from his heart as you saw it,” Lang told the family in court.
Support and Skepticism
For Pelkey’s family, the video was a source of peace. “Everybody knew that Chris would forgive this person,” Wales said afterward, adding that the AI statement gave her brother a voice in the courtroom that felt true to his spirit.
But not everyone was convinced. Horcasitas’s attorney, Jason Lamm, criticized the video and filed a notice of appeal, arguing that the judge may have improperly considered the AI-generated statement in sentencing.
Legal scholars also raised red flags. Gary Marchant, a law professor at Arizona State University, noted that while this case was rooted in authentic testimony from loved ones, AI deepfakes could easily be abused. “There’s a real concern among the judiciary and among lawyers that deepfake evidence will be increasingly used,” he warned.
A Legal First With Broader Implications
Arizona law permits victims to deliver impact statements in digital formats, from letters to videos. But this marks the first known case in the United States where AI was used to recreate a deceased victim’s likeness and voice for such a statement.
The Arizona Supreme Court has since formed a committee to study AI’s role in the courts, a reflection of both the promise and the risks of this rapidly advancing technology.
For now, the case of Christopher Pelkey stands as a landmark: a moment that gave a grieving family comfort, but also opened a Pandora’s box of ethical and legal questions about authenticity, manipulation, and the future of justice in the age of artificial intelligence.