On Wednesday, the attorneys general of all 50 US states and four territories sent a letter to Congress urging lawmakers to establish an expert commission to study how generative AI can be used to exploit children through child sexual abuse material (CSAM). They also called for expanding existing laws against CSAM to explicitly cover AI-generated materials.
"As Attorneys General of our respective States and territories, we have a deep and grave concern for the safety of the children within our respective jurisdictions," the letter reads. "And while Internet crimes against children are already being actively prosecuted, we are concerned that AI is creating a new frontier for abuse that makes such prosecution more difficult."
In particular, open source image synthesis technologies such as Stable Diffusion allow the creation of AI-generated pornography with ease, and a large community has formed around tools and add-ons that enhance this ability. Since these AI models are openly available and often run locally, there are sometimes no guardrails preventing someone from creating sexualized images of children, and that has rung alarm bells among the nation's top prosecutors. (It's worth noting that Midjourney, DALL-E, and Adobe Firefly all have built-in filters that bar the creation of pornographic content.)
"Creating these images is easier than ever," the letter reads, "as anyone can download the AI tools to their computer and create images by simply typing in a short description of what the user wants to see. And because many of these AI tools are 'open source,' the tools can be run in an unrestricted and unpoliced way."
As we have previously covered, it has also become relatively easy to create AI-generated deepfakes of people without their consent using social media photos.
(Score: 2, Interesting) by pTamok on Saturday September 09 2023, @04:29PM (2 children)
You have no way of knowing if it has been drawn from life, or from a photo or video of real abuse.
And because you have no way of knowing, you have to assume the worst, because if you don't, you've just created a market. Sure, every image creator will say it was drawn entirely from their imagination. Of course they would, wouldn't they?
Similarly, a disclaimer, like you find in fictional texts, saying that 'any resemblance between the fictional character and someone real is entirely accidental' doesn't particularly help if it looks like a real child. Of course they'll have a disclaimer. They would say that, wouldn't they?
You can't prove a negative. You can't prove it is not real. The only winning move is not to play.
(Score: 1) by khallow on Sunday September 10 2023, @12:53AM
Why do you have to "assume the worst"?
How about innocent until proven guilty? I think that's a more winning move.
(Score: 3, Insightful) by Opportunist on Monday September 11 2023, @05:02PM
So given Stephen King, I should assume he has done all the things he writes about?
What about Agatha Christie, should I assume she murdered a couple hundred people?
Why the hell should I "assume the worst"? Oh, right, "think of the children"
If you think of the children all the time, my first guess would be that you're a pedo.