On Wednesday, attorneys general from all 50 US states and four territories sent a letter to Congress urging lawmakers to establish an expert commission to study how generative AI can be used to exploit children through child sexual abuse material (CSAM). The letter also calls for expanding existing laws against CSAM to explicitly cover AI-generated materials.
"As Attorneys General of our respective States and territories, we have a deep and grave concern for the safety of the children within our respective jurisdictions," the letter reads. "And while Internet crimes against children are already being actively prosecuted, we are concerned that AI is creating a new frontier for abuse that makes such prosecution more difficult."
In particular, open source image synthesis technologies such as Stable Diffusion allow the creation of AI-generated pornography with ease, and a large community has formed around tools and add-ons that enhance this ability. Since these AI models are openly available and often run locally, there are sometimes no guardrails preventing someone from creating sexualized images of children, and that has rung alarm bells among the nation's top prosecutors. (It's worth noting that Midjourney, DALL-E, and Adobe Firefly all have built-in filters that bar the creation of pornographic content.)
"Creating these images is easier than ever," the letter reads, "as anyone can download the AI tools to their computer and create images by simply typing in a short description of what the user wants to see. And because many of these AI tools are 'open source,' the tools can be run in an unrestricted and unpoliced way."
As we have previously covered, it has also become relatively easy to create AI-generated deepfakes of people without their consent using social media photos.
(Score: 2) by NotSanguine on Wednesday September 13 2023, @07:25PM
Came over here from the "deadly Dorito" article based on your link.
The argument you're making was made WRT Lolita [wikipedia.org] too. As well as many of Judy Blume's [wikipedia.org] books.
Shall we ban those too?
I'd also note that much of the medical/psychiatric community considers pedophilia to be a sexual orientation [nih.gov] just like heterosexuality or homosexuality. As such, you won't "create" new pedophiles with fictional accounts, art, etc., whether it's AI generated or not.
That said, understanding this doesn't make acting on such desires appropriate, as prepubescent people don't have the emotional development/capacity to meaningfully consent to such activities.
I agree that it's important to take action (laws and enforcement of same as well as treatment to address those with such an orientation) to prevent the abuse and exploitation of children, but fiction is fiction in whatever media -- and no child is harmed by fictional accounts.
Video games, music, and other media have also seen demands that they be banned because "think of the children." Which is a bunch of bullshit.
N.B.: I have no sexual interest in the prepubescent and am horrified by abuse/exploitation, sexual or otherwise, of kids.
No, no, you're not thinking; you're just being logical. --Niels Bohr