
posted by hubie on Saturday September 09 2023, @01:42AM   Printer-friendly
from the think-of-the-AI-generated-children dept.

https://arstechnica.com/information-technology/2023/09/ai-generated-child-sex-imagery-has-every-us-attorney-general-calling-for-action/

On Wednesday, American attorneys general from all 50 states and four territories sent a letter to Congress urging lawmakers to establish an expert commission to study how generative AI can be used to exploit children through child sexual abuse material (CSAM). They also call for expanding existing laws against CSAM to explicitly cover AI-generated materials.

"As Attorneys General of our respective States and territories, we have a deep and grave concern for the safety of the children within our respective jurisdictions," the letter reads. "And while Internet crimes against children are already being actively prosecuted, we are concerned that AI is creating a new frontier for abuse that makes such prosecution more difficult."

In particular, open source image synthesis technologies such as Stable Diffusion allow the creation of AI-generated pornography with ease, and a large community has formed around tools and add-ons that enhance this ability. Since these AI models are openly available and often run locally, there are sometimes no guardrails preventing someone from creating sexualized images of children, and that has rung alarm bells among the nation's top prosecutors. (It's worth noting that Midjourney, DALL-E, and Adobe Firefly all have built-in filters that bar the creation of pornographic content.)

"Creating these images is easier than ever," the letter reads, "as anyone can download the AI tools to their computer and create images by simply typing in a short description of what the user wants to see. And because many of these AI tools are 'open source,' the tools can be run in an unrestricted and unpoliced way."

As we have previously covered, it has also become relatively easy to create AI-generated deepfakes of people without their consent using social media photos.


Original Submission

This discussion was created by hubie (1068) for logged-in users only, but now has been archived. No new comments can be posted.
  • (Score: 4, Insightful) by Opportunist on Saturday September 09 2023, @12:50PM (12 children)

    by Opportunist (5545) on Saturday September 09 2023, @12:50PM (#1323835)

    So a picture is drawn of a child being abused. Unless that picture is drawn of an actual child being abused, who exactly is the victim? Who is the law allegedly trying to protect here?

    What's next, arresting Stephen King for writing stories about ... I mean, have you read some of his stories? That guy's a sicko, lock him up, what's going on in his head is clearly not ok.

  • (Score: 0) by Anonymous Coward on Saturday September 09 2023, @02:15PM (8 children)

    by Anonymous Coward on Saturday September 09 2023, @02:15PM (#1323853)

    Haven't you seen artists and authors get angry about their work being included in the gigantic training sets?

    Well, clothed and nude children can also be included in smaller training sets.

    https://civitai.com/models [civitai.com]

    There's a vibrant community creating and mixing models, using their own training data. Look at the civitai link to see the typical subject matter. Notice that some of the models are aiming for photorealism.

    Now imagine there are people out there using actual child pornography to train models. The model files themselves contain no images, yet they can be used to generate images of specific children endlessly.

    Imagine that this has been happening on imageboards and the Fediverse for over a year now.

    • (Score: 1) by pTamok on Saturday September 09 2023, @04:17PM (7 children)

      by pTamok (3042) on Saturday September 09 2023, @04:17PM (#1323873)

      Given that I've never heard of civitai before, in the context of this discussion I'm not going to click on that link.

      Knowing that web browsers pre-fetch links on pages, I'm now worried about what might already have been downloaded into my local cache.
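
      The worry above is about pages that declare resource hints such as <link rel="prefetch">, which ask the browser to fetch a URL before the user ever clicks it. A minimal sketch of spotting such hints in a page's HTML, using only the Python standard library (the sample markup and URLs are hypothetical, purely for illustration):

      ```python
      # Sketch: find <link> resource hints that ask the browser to fetch
      # URLs in advance of any click. Sample HTML below is made up.
      from html.parser import HTMLParser

      class PrefetchFinder(HTMLParser):
          """Collect URLs a page asks the browser to fetch preemptively."""
          def __init__(self):
              super().__init__()
              self.prefetched = []

          def handle_starttag(self, tag, attrs):
              a = dict(attrs)
              if tag == "link" and a.get("rel") in ("prefetch", "preload", "prerender"):
                  self.prefetched.append(a.get("href"))

      sample = """
      <html><head>
        <link rel="prefetch" href="https://example.com/next-page.html">
        <link rel="stylesheet" href="style.css">
      </head><body><a href="https://example.com/other">a normal link</a></body></html>
      """

      finder = PrefetchFinder()
      finder.feed(sample)
      print(finder.prefetched)
      ```

      Whether a given browser honors these hints (and speculatively fetches ordinary <a href> targets at all) varies by browser and settings, so the cache-contamination risk depends on configuration.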

      And if you routinely run javascript, you have no idea what advertising networks could be downloading in the background. Remember too, they are not immune to compromise.

      It is all too easy to end up with illegal-to-possess images on your storage media without necessarily knowing about it. Being investigated for that kind of stuff is 'somewhat' disruptive to your life, as related by (innocent) people who have gone through it.

      One could get depressed about the human condition.

      • (Score: 0) by Anonymous Coward on Saturday September 09 2023, @05:40PM (3 children)

        by Anonymous Coward on Saturday September 09 2023, @05:40PM (#1323882)

        https://github.com/civitai/civitai [github.com]
        https://www.crunchbase.com/organization/civitai [crunchbase.com]

        Check the link. Use a VPN if you must.

        • (Score: 1) by pTamok on Saturday September 09 2023, @05:59PM

          by pTamok (3042) on Saturday September 09 2023, @05:59PM (#1323884)

          Thank you.

        • (Score: 0) by Anonymous Coward on Sunday September 10 2023, @12:10PM (1 child)

          by Anonymous Coward on Sunday September 10 2023, @12:10PM (#1323959)

          God damn... it's civitai, a site for normal SD models, not some CP site. That someone is scared to click this link is part of the problem, and you've already smeared the community by acting like they're some bastion of CP.

          • (Score: 0) by Anonymous Coward on Sunday September 10 2023, @12:57PM

            by Anonymous Coward on Sunday September 10 2023, @12:57PM (#1323963)

            I did nothing wrong. I gave them the link and told them to click it. I wouldn't link CP, but civitai is a bastion of coomer bait.

      • (Score: 2) by maxwell demon on Saturday September 09 2023, @05:45PM (2 children)

        by maxwell demon (1608) on Saturday September 09 2023, @05:45PM (#1323883) Journal

        Or you may just have wanted to help, as happened to a teacher in Germany.

        Unfortunately I couldn't find anything about it in English, so here is a German-language link:

        https://www.heise.de/news/Sexualstrafrecht-Lehrerin-will-bei-intimen-Video-vermitteln-Anklage-folgt-9288687.html [heise.de]

        (I tried to translate the first paragraphs with DeepL, but it seems to hang at the moment)

        --
        The Tao of math: The numbers you can count are not the real numbers.
        • (Score: 1) by pTamok on Saturday September 09 2023, @06:11PM

          by pTamok (3042) on Saturday September 09 2023, @06:11PM (#1323886)

          My German is just good enough to understand. (I could be wrong.)

          The key words for me are 'judicially correct'.

          But insane in the real world. From the reporting (and of course, we don't know the full story), the consequences for the teacher were unreasonably harsh.

        • (Score: 3, Touché) by Opportunist on Monday September 11 2023, @05:10PM

          by Opportunist (5545) on Monday September 11 2023, @05:10PM (#1324091)

          In a nutshell: a teacher tried to stop a sexually explicit video of a 13-year-old girl from being distributed and saved it to her phone to show the parents, but now she's in possession of child pornography, which cost her her job and pretty much ended her career.

          And people ask me why I cross the street and try to get as much distance as I possibly can if I see a child crying all alone. The LAST thing I would want is to get involved in that in ANY way. It's literally better that this kid dies than for me to get involved in ANY way; I cannot win anything here, but I could effectively lose my life. Pretending that I never saw or heard that kid in distress is the sensible thing to do here.

  • (Score: 2, Interesting) by pTamok on Saturday September 09 2023, @04:29PM (2 children)

    by pTamok (3042) on Saturday September 09 2023, @04:29PM (#1323874)

    You have no way of knowing if it has been drawn from life, or from a photo or video of a real record of abuse.

    And because you have no way of knowing, you have to assume the worst, because if you don't, you've just created a market. Sure, every image creator will say it was drawn entirely from their imagination. Of course they would, wouldn't they?

    Similarly, a disclaimer, like you find in fictional texts, saying that 'any resemblance between the fictional character and someone real is entirely accidental' doesn't particularly help if it looks like a real child. Of course they'll have a disclaimer. They would say that, wouldn't they?

    You can't prove a negative. You can't prove it is not-real. The only winning move is not to play.

    • (Score: 1) by khallow on Sunday September 10 2023, @12:53AM

      by khallow (3766) Subscriber Badge on Sunday September 10 2023, @12:53AM (#1323924) Journal

      And because you have no way of knowing, you have to assume the worst, because if you don't, you've just created a market.

      Why do you have to "assume the worst"?

      You can't prove a negative. You can't prove it is not-real. The only winning move is not to play.

      How about innocent until proven guilty? I think that's a more winning move.

    • (Score: 3, Insightful) by Opportunist on Monday September 11 2023, @05:02PM

      by Opportunist (5545) on Monday September 11 2023, @05:02PM (#1324089)

      So given Stephen King, I should assume he has done all the things he writes about?

      What about Agatha Christie, should I assume she murdered a couple hundred people?

      Why the hell should I "assume the worst"? Oh, right, "think of the children"

      If you think of the children all the time, my first guess would be that you're a pedo.