A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting the danger and ubiquity of generative AI being used for nefarious reasons.
Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, producing dramatic footage of law enforcement leading McCorkle, still in his work uniform, out of the theater in handcuffs.
This creates a significant legal issue: AI-generated images have no age, and there is no one to give or withhold consent. The difference in appearance between age 16 and 18 is minimal, but the legal difference is immense, and that line rests on a concept, chronological age, that simply cannot apply to a generated image.
How do you define what counts as depicting a fictional child, especially without sweeping in real adults? I've met people who believe that preferring a shaved pubic area is pedophilia, even though the vast majority of adult women shave. On the flip side, teenagers from the '70s and '80s would be mistaken for 40+ today.
Even the extremes aren't clear. Adult star "Little Lupe," who was 18+ in every single appearance, lacked most secondary sex characteristics, and experts testified in court that she could not possibly be an adult. Except she was, and there's full documentation to prove it. Would an AI trained exclusively on her work be producing CSAM?
Do we actually know that AI child porn is harmful? I could believe it would get offenders in the mood for the real thing and make them act more often, and I could equally believe it would make them go "ok, itch scratched" and tank the demand for the real stuff.
Depending on which way it goes, it could be massively helpful for protecting kids. I just don't have a sense for what the effect would be, and I've never seen any experts weigh in.
Could this be considered a harm reduction strategy?
Not that I think CSAM is good in any way, but if it saves a child would it be worthwhile? Like if these pedos were to use AI images instead of actual CSAM would that be any better?
I've read that CSAM sites on the dark web number into the hundreds of thousands. I just wonder if it would be a less harmful thing since it's such a problem.
To be clear, I am happy to see a pedo contained and isolated from society.
At the same time, this direction of law is something that I don't feel I have the sophistication to truly weigh in on, even though it invokes so many thoughts for me.
I must admit, the number of comments defending AI images as not being child porn is truly shocking.
In my book, sexual images of children are not okay, AI-generated or otherwise. Pedophiles need help, counseling, and therapy, not images that enable behavior I think is unacceptable in society.
I truly do believe that AI images should be subject to the same standards as regular images in what content we deem appropriate or not.
Yes, this can be used to wrongfully prosecute innocent people, but it does not mean that we should freely allow AI-CP.
If this thread (and others like it) have taught me anything, it's that facts be damned, people are opinionated either way. Nuance means nothing, and it's basically impossible to have a proper discussion when it comes to wedge issues or anything that can be used to divide people. Even if every study said 100% that AI-generated CSAM always led to a reduction in actual child harm, reduced recidivism, and never required real children as training material, the comments would still look pretty much the same. If the studies showed the exact opposite, the comments would also be the same. Welcome to the internet. I hope you brought aspirin.
Show me multiple (let's say 3+) small-scale independent academic studies, or 1-2 comprehensive and large academic studies, that support one side or the other and I may be swayed. Otherwise, I think all that is being accomplished is that one guy's life is getting completely ruined, for now and potentially forever, over some fabrications. As a result he may or may not get help, but I doubt he'll be better off.
My understanding was that CSAM has its legal status specifically because there are victims who are hurt by these crimes, and possession supports a broader market that facilitates said harm to those victims. It's not as easy to make a morality argument (especially a good one) for laws that affect everybody when there are no known victims.