Thanks for sharing; I thought this was a fascinating read, especially since it ended on a positive note rather than pure condemnation. It seems totally Black Mirror-esque, but I do wonder how many of the commenters here attacking it didn't read the article. The family obviously didn't make this decision lightly, given how much work it took to create, and even the judge appreciated the novel approach. This is probably one of the best-case scenarios relative to the abyss of unstoppable horror that awaits us.
Perhaps; from everything the article says, it seemed like they knew the decedent well enough to know he would have appreciated this. That said, I also wouldn't be surprised if templates for wills or living trusts start adding a no-duplication clause by default over the coming years.
It was a victim impact statement, not subject to the rules of evidence. The shooter had already been found guilty, and this was an impact statement from the victim’s sister, to sway how the shooter should be sentenced. The victim’s bill of rights says that victims should be allowed to choose the method in which they make an impact statement, and his sister chose the AI video.
I agree that it shouldn’t be admissible as evidence. But that’s not really what’s being discussed here, because it wasn’t being used as evidence. The shooter was already found guilty.
For this? The guy who was brought back through AI was killed in a road rage shooting, and then they brought the AI version of him to court to give a statement from beyond the grave, of sorts. I think it's immoral as fuck, but I'm sure I'll get told why it's actually not.
This was played before sentencing. It doesn't say it here, but the article I read earlier today stated that because of this video, the judge issued a sentence greater than the maximum recommended by the State. If true, then it really calls into question the sentence itself and how impartial the judge was.
Oh - then that’s fucked up. Synthesizing some narrative to potentially coerce an outcome seems like a slippery slope. (Not necessarily saying that’s exactly what happened here.)
A victim impact statement is a written or oral statement made as part of the judicial legal process, which allows crime victims the opportunity to speak during the sentencing of the convicted person or at subsequent parole hearings.
From the article (emphasis mine):
But the use of AI for a victim impact statement appears novel, according to Maura Grossman, a professor at the University of Waterloo who has studied the applications of AI in criminal and civil cases. She added that she did not see any major legal or ethical issues in Pelkey's case.
"Because this is in front of a judge, not a jury, and because the video wasn't submitted as evidence per se, its impact is more limited," she told NPR via email.