On Wednesday, American attorneys general from all 50 states and four territories sent a letter to Congress urging lawmakers to establish an expert commission to study how generative AI can be used to exploit children through child sexual abuse material (CSAM). They also called for expanding existing laws against CSAM to explicitly cover AI-generated materials.
"As Attorneys General of our respective States and territories, we have a deep and grave concern for the safety of the children within our respective jurisdictions," the letter reads. "And while Internet crimes against children are already being actively prosecuted, we are concerned that AI is creating a new frontier for abuse that makes such prosecution more difficult."
In particular, open source image synthesis technologies such as Stable Diffusion allow the creation of AI-generated pornography with ease, and a large community has formed around tools and add-ons that enhance this ability. Since these AI models are openly available and often run locally, there are sometimes no guardrails preventing someone from creating sexualized images of children, and that has rung alarm bells among the nation's top prosecutors. (It's worth noting that Midjourney, DALL-E, and Adobe Firefly all have built-in filters that bar the creation of pornographic content.)
"Creating these images is easier than ever," the letter reads, "as anyone can download the AI tools to their computer and create images by simply typing in a short description of what the user wants to see. And because many of these AI tools are 'open source,' the tools can be run in an unrestricted and unpoliced way."
As we have previously covered, it has also become relatively easy to create AI-generated deepfakes of people without their consent using social media photos.
(Score: 5, Insightful) by Username on Saturday September 09 2023, @08:49AM (3 children)
I think the problem is that investigators cannot tell the difference between real and AI-generated images, and rather than investigate each image they just want to declare it all real.
(Score: 3, Touché) by driverless on Saturday September 09 2023, @10:31AM (1 child)
Well that's not hard to do, if your nonexistent-child image has six fingers, legs with three joints, and a mouth with 90 degree angles in it, then it's AI-generated.
That's actually another problem: AI generates a lot of anime-style doll-like faces. How is anyone going to be able to tell whether the doll-face attached to the 38DDs is a nonexistent child or a nonexistent adult? Can't these people get back to arguing over the satanic symbolism of pizza slices or something?
(Score: 2, Insightful) by Anonymous Coward on Saturday September 09 2023, @12:37PM
We already see much better AI-generated images from the latest models. The anatomy needs little or no correction, and they can be much harder to distinguish from photographs.
(Score: 3, Insightful) by Immerman on Saturday September 09 2023, @02:23PM
Not really a problem - possessing child porn is already illegal in the US, even if it's hand-drawn cartoons.
This law is specifically about requiring companies to prevent their AI from generating such images... somehow.
Shall we next pass a law requiring pencil makers to prevent their pencils from being used to draw child porn?