posted by hubie on Saturday September 09 2023, @01:42AM
from the think-of-the-AI-generated-children dept.

On Wednesday, American attorneys general from all 50 states and four territories sent a letter to Congress urging lawmakers to establish an expert commission to study how generative AI can be used to exploit children through child sexual abuse material (CSAM). They also called for expanding existing laws against CSAM to explicitly cover AI-generated materials.

"As Attorneys General of our respective States and territories, we have a deep and grave concern for the safety of the children within our respective jurisdictions," the letter reads. "And while Internet crimes against children are already being actively prosecuted, we are concerned that AI is creating a new frontier for abuse that makes such prosecution more difficult."

In particular, open source image synthesis technologies such as Stable Diffusion allow the creation of AI-generated pornography with ease, and a large community has formed around tools and add-ons that enhance this ability. Since these AI models are openly available and often run locally, there are sometimes no guardrails preventing someone from creating sexualized images of children, and that has rung alarm bells among the nation's top prosecutors. (It's worth noting that Midjourney, DALL-E, and Adobe Firefly all have built-in filters that bar the creation of pornographic content.)

"Creating these images is easier than ever," the letter reads, "as anyone can download the AI tools to their computer and create images by simply typing in a short description of what the user wants to see. And because many of these AI tools are 'open source,' the tools can be run in an unrestricted and unpoliced way."

As we have previously covered, it has also become relatively easy to create AI-generated deepfakes of people without their consent using social media photos.

Original Submission

This discussion was created by hubie (1068) for logged-in users only, but now has been archived. No new comments can be posted.
  • (Score: 2, Interesting) by pTamok on Saturday September 09 2023, @05:36PM (1 child)


No child was involved in the creation of that image, just as no person is actually shot when I sketch a shooting. Should an image of a sexual assault mean that somebody has attempted to commit rape? Or is it only because the image involves an imaginary child?

    I think an issue here is that we don't want to normalise the idea that sexual activity with a participant that, by legal definition, cannot consent is OK.

But you raise a good point with regard to rape porn. I believe it is an actual genre. But since the (presumably adult) participants creating the (fictional) depiction of rape are capable of consenting to the activity, the play-acting in itself is not illegal, even though it depicts an illegal act. Not that I think it is healthy, but that's just my opinion. I do know that some porn actors have made statements to the effect that the recorded scenes were in fact non-consensual*. Which is a problem.

    Obviously people don't tend to consent to being murdered or tortured, but Hollywood has no problem making entertainment media that contains scenes of both. So it is legal to depict illegal acts that are non-consensual. Usually there is a moral justification for the good guys killing the bad guys, and the bad guys kill because they are, well, bad. This might explain why it is OK. I don't see a moral justification for sexual abuse of minors, who cannot consent, and therefore no moral justification for depictions of such abuse. If you could argue that looking at depictions of sexual abuse of minors would reduce the actual sexual abuse of minors, you might be able to make a moral justification for producing those images, but I think that is a pretty hard thing to demonstrate.

    *For example, Linda Lovelace, the actress in Deep Throat, made statements to that effect []. Many others have made similar statements.

  • (Score: 2) by Opportunist on Monday September 11 2023, @05:17PM


    Fictional characters can neither consent nor withhold consent. They will do whatever their creator makes them do.

    Worse, fictional characters do not have to conform to reality at all. This here is Ixi. Ixi is a fictional character I just created. Ixi looks like an 8-year-old, but she only looks it; she is actually 20 years old, and thus an adult who can consent. In fact, she will consent to anything, because that is how I created her: she is fictional, and as such she is whatever I make her.

    What now?