posted by
hubie
on Friday January 24, @09:53AM
from the sounds-like-they-need-to-arrest-the-AI-too dept.

Late last year, California passed a law against the possession or distribution of child sex abuse material that has been generated by AI. The law went into effect on January 1, and Sacramento police announced yesterday that they have arrested their first suspect, 49-year-old Pulitzer-prize-winning cartoonist Darrin Bell. The new law, which you can read here, declares that AI-generated CSAM is harmful, even without an actual victim. "The creation of CSAM using AI is inherently harmful to children because the machine-learning models utilized by AI have been trained on datasets containing thousands of depictions of known CSAM victims, revictimizing these real children by using their likeness to generate AI CSAM images into perpetuity."
This discussion was created by hubie (1068) for logged-in users only, but now has been archived.
No new comments can be posted.
New California Law Criminalizing AI Generated Child Porn Claims First Arrest
(Score: 5, Insightful) by pkrasimirov on Friday January 24, @11:25AM (2 children)
Wait, what? Do they really train their AI on child porn? If yes, the problem is not generating new child porn out of it; it's generating the original child porn in the first place, and using it, and distributing it. OMG
(Score: 4, Interesting) by zocalo on Friday January 24, @11:59AM (1 child)
The thought does occur that, given the unfettered trawling of web data the AI crawlers are engaged in, there's a very high probability of CSAM being scraped into the datasets; it's not like the hosts of the CSAM being scraped are going to complain and ask for it to be removed, is it? That being the case, the logical next questions are how much CSAM actually has been scraped into the datasets, and how messy things might get if one of the [N]GOs that typically deal with CSAM issues can demonstrate it's there.
UNIX? They're not even circumcised! Savages!
(Score: 1, Insightful) by Anonymous Coward on Friday January 24, @03:05PM
Whether you're a criminal or not is not supposed to be based on assumption. Warrants, yes - they get a judge to sign one, and then they come and look for proof. If they don't find that "CSAM they already have," then you're not breaking the law.
In this case, they are explicitly criminalizing the "looks-like-to-me" output of an AI image generator. Better hope *you* never have one misunderstand you.
(Score: 1, Interesting) by Anonymous Coward on Friday January 24, @02:41PM (7 children)
It's still child porn, right? Or was there a ruling that somehow it wasn't "because AI"? What did the law say that made this necessary? Maybe they were worried that if no actual child was photographed (untrue, given the training set) this might somehow slip by, but if someone drew it by hand, wouldn't that be a similar situation?
(Score: 3, Informative) by VLM on Friday January 24, @02:47PM
In my other post:
So, the analogy of asking a dude to "draw it" vs asking a LLM to "draw it".
I suspect the problem is not out-of-nothing creativity but AI-assisted Photoshop-type editing: it starts with a normal picture of a fully dressed kid, and at some point extensive AI photo editing would become illegal, so it's easier to just make it all illegal.
(Score: 3, Interesting) by janrinok on Friday January 24, @03:01PM (1 child)
As you have correctly pointed out, there is no need for any additional law just because it is AI.
The current law actually talks about 'creating an image' which covers any method whatsoever, even those methods that have not yet been invented.
I believe the problem in this case stems from the voracious appetite of AI during training. The companies do not produce all their own data; they use data from any source that they can, and we know that they also trawl the internet to find more. They do not vet every (or even any?) site that they scrape, so it is likely that some illegal material will have been used. Other companies combine existing datasets to produce one large dataset which they believe will meet their needs. Any commercial dataset of size may well be contaminated by CSAM, but not in a format which makes it easily recognisable.
Some here on SN have suggested polluting data that is being scraped to defeat AI. Perhaps it has already happened unintentionally but I don't suppose any technology company will be held responsible. I know several people, including myself, who regularly use Stable Diffusion or one of the many similar software packages. We have all seen images produced that do not meet the 'prompt' that we have provided and some have obviously been based on pornographic images although not necessarily CSAM.
I have no doubt that there will be some who intentionally create CSAM either for their own criminal use or to sell as data to be included in a larger package e.g "1 million images of people in normal situations". Do you think anyone is going to look at each of those 1,000,000 images before they use them? I doubt it.
[nostyle RIP 06 May 2025]
(Score: 5, Interesting) by bussdriver on Friday January 24, @07:21PM
Well-intentioned people make stupid, broad policies all the time! A functioning democracy or organization will be able to try out policies rapidly and perhaps correct them quickly as well. Some things just have to be "common sense" (which, as the French say, seems to be rare) and left out of policy.
The whole issue is insane because people can't think clearly on the topic, and in the USA science and reason are under assault, so there is no hope of rational... anything. I don't usually bother to discuss the topic:
1) There is no proof that porn causes more rape. I have seen some things saying it might decrease it slightly. Puritans have been claiming BS for centuries; they used to claim alcohol created all sorts of evils. (It never is the source; humans are always the problem.)
2) ALL pedophiles like adults too! They might prefer infants, children, or teens (puberty being the teen threshold), each of which is a different kind of pedophile. This can be a preference like hair color. For active pedos, it's because younger people are far easier to exploit, which puts them not far from (if not the same as) all the others who try to exploit power differences. Old men with money going after younger women are often a mild form (not always exploitation; hell, the young one can be the primary exploiter). Some rare situations are exploitation-free; they can be legitimately "in love" and so on (but being far outside the norm, it becomes sick... if that isn't mentally ill, nothing is).
3) Genetic connections have been made; it's harder to gather data when you are not scientific about it. Teenagers used to get married as adults; you wreck the dataset if you don't separate the teenagers out. The 16-21 range is arbitrary, not scientific. This genetic defect merely makes somebody attracted to children. Some day it could be identified; however, we shouldn't pick on these people, because it's no different from being attracted to adults: they CHOOSE to exploit or attack another person. Going after a child is like going after a drunk person: it's easier. They might not be bad enough to violently attack somebody, but bad enough to exploit somebody, or to drug them and then exploit the situation, and so on. Thinking more clearly, the punishments should be the SAME! We treat drugging women far more leniently than pedophilia. Violent, forced attacks are not the same and should be handled differently. Furthermore, some people will fool themselves into thinking the drunk woman wanted it and rationalize it to themselves; the topic then becomes blurry. Add to that a man who genuinely can't tell someone is a child: they can be confused, but not about their society's rules; they know they are breaking those. But again, confused, as if society banned you from your love like Romeo and Juliet. Couples can make such arguments; they're not hard to find (I'm speaking of teenagers; children can't yet grasp those arguments). When you realize the truth, that these are birth defects, you'd treat them like anybody else. It also brings up the issue of allowing genetic defects to spread to their children... You could be one; if you have a lustful feeling towards children at times but immediately police yourself, you may not even realize it if well policed.
It's like how a religious person may limit or deny their lustful impulses when watching advertising (loaded with sex appeal; if you don't feel anything, then your System 2 is blocking your awareness, while System 1 functions without consciousness). So, back to the point: you could be born a pedo and not even realize it. I could make gay analogies, but people smear them so much as it is, and they understandably get too defensive to get the point... but we know a lot of people get confused about their sexual orientation. You just don't want to think that happens with pedos too, and people with some such thoughts will be even more afraid than a homophobic person (as we know the extreme ones had such thoughts... at the root of their fear-driven extreme position).
4) Generative AI can easily be trained without any CSAM. You can generate tons of things not even remotely in the training data. Have these people never played with a generator at all?
5) Generated material is harm-free; for that matter, all the old materials are doing no new harm, and when they become so old that the children in them have died of old age, the claim will be even more ridiculous. Again, back to #1. But it is also TRUE that making a business from those materials must be a severe crime, because it funds the creation of new material. Possessing freely distributed materials doesn't directly promote anything. I'm not against it being a mild crime, but it really should be a "mental crime": we need a psychology branch of policing. Not a crime, but a health condition. Addicts are a mental-health problem, not criminals. Possession of such materials is AN INDICATOR. It doesn't mean they'll act upon it any more than people watching rape-fantasy porn will. It also becomes a tough judgement call to spot possible rapists from their fantasies; experts will have a high failure rate. The solution is to remove stigmas and involve at least some psychologist-crafted disclaimers. Pedos can be treated more like suicide attempts as far as the medical response goes. You could drug their sex drive, as you might for rapists. (Though my position on rapists is outside psychology, as I advocate the death penalty, which I'm against for murders. Yes, since active pedos are likely committing rape, they should die too. The damage caused by rape can be worse than murder, and the age of the victim doesn't matter, except that in reality the younger victim has better odds of recovery.)
6) So, are we going to outlaw sex robots next? The early ones are likely to be small regardless of looks, simply due to cost, and to look young, so it'll be like teenagers who hit puberty early. How are they going to regulate those? I doubt any study will be done on whether those reduce real problems. We don't like to discuss legal prostitution reducing rape, so I don't expect any progress. It's always about controlling others and imposing your unfounded beliefs upon them.
I grossly over-summarized, leaving room for tons of criticisms. I suggest you dig rather than have me clarify books' worth of reading and the multiple in-person interviews I've had with rape and pedo victims.
(Score: 2, Interesting) by Anonymous Coward on Friday January 24, @03:08PM (2 children)
What's the age of the person in that AI image?
How does that age/look compare to a 13-year-old covered in make-up? How does it compare to a 21-year-old with big eyes? How does it compare to a dwarf with supple skin? Does it have elf ears (which the California prosecutor said explicitly makes it non-human, and thus not subject to these laws)?
How do you establish the *legal* age of a fictional creation? (Once you do, can they vote?)
(Score: 0) by Anonymous Coward on Friday January 24, @03:24PM (1 child)
Just a few seconds old after they are generated, duh!
(Score: 0) by Anonymous Coward on Friday January 24, @04:54PM
Oh my.
Those AI-generated images of Donald's look-alike are problematic in a whole new way...
(Score: 2) by Tork on Friday January 24, @05:00PM
Maybe but not always. And I don't mean there's an obscure fringe case, I mean a meatbag doesn't need to see actual child harm to draw an image depicting it. Consider how movies get made. Example: Earth as seen from orbit was depicted in movies long before we ever actually had a photograph of what Earth looks like from space. But you are right that there'll be cases where an artist's reference material could come into play.
🏳️🌈 Proud Ally 🏳️🌈
(Score: 5, Informative) by VLM on Friday January 24, @02:42PM (1 child)
I looked at the actual law, as it's legally interesting.
Looks like they're busting people for both prompts and results in parallel. In theory, you could get busted for a result you didn't ask for. However, read on:
Its definitions are full of "knowingly". So if you ask for a picture of "Charlie Brown and Lucy playing with a football" and you get a bad picture, you seem to be excluded from prosecution as long as you don't keep it, distribute it, etc. The guy they busted must have been quite literally asking for it, in the prompts he wrote. If you're not dumb enough to ask for it, it'll likely be a tough prosecution.
Paragraph 311.11 (d)(1) "does not apply to drawings, figurines, or statues." I'm sure this will turn into a huge shitshow where asking for a 3D-printed statue of a bad object is semi-legal as long as the STL file never enters the state, although the STL file itself would be illegal. Kind of mystified how those are legal. Obviously some CA politician's family members must sell "special figurines" as a side gig (or maybe a main job, who knows?)
There's other stuff that's just novel to read: for punishment purposes, a video is worth exactly 50 pictures. Not 51, not 49, exactly 50. I did not search deeply enough to find out if they also encoded into law that a picture is worth a thousand words, LOL.
Another oddity: kids can do whatever they want in a movie rated by the MPAA. A video tape of "Romeo and Juliet" rated by the MPAA and bought at a store is legal, because it's written into law that by definition no MPAA-rated movie is illegal; a home camcorder recording of a public presentation of a theater play MIGHT be CP until it's submitted for MPAA rating.
CA usually does a bad job of everything, this is not as bad as usual but not perfect either. Overall it was an interesting read.
(Score: 4, Insightful) by Whoever on Friday January 24, @04:12PM
This is similar to how prostitution is illegal, unless there is a camera crew present and the intent is to create a pornographic movie.
(Score: 5, Interesting) by ElizabethGreene on Friday January 24, @04:07PM (3 children)
This gets dubious pretty quickly. "The nonexistent person in your AI generated image looks too young. You're going to jail."
I knew a woman who looked 14 when she was in her late 20s. She had a strict no-PDA* rule with her husband because he'd been confronted and assaulted by "good Samaritans" multiple times.
* Public Display of Affection, not Personal Digital Assistant
(Score: 1, Interesting) by Anonymous Coward on Friday January 24, @06:07PM
How about manga and anime... Some of those centuries-old demons and gods/demigods look like little girls.
Yeah, there are many like that. Also, many Asians look young. For example, many don't realize that this singer was about 27 years old at the time of the video and older than Doja Cat: https://www.youtube.com/watch?v=EsZbWAqU8xY [youtube.com]
See also: https://imgur.com/aging-asians-HNJFgbC [imgur.com]
p.s. which is why I do wonder whether some accusations of child labor by "western media" are really true... "Oh no, this factory is full of underaged girls working, shut it down!".
(Score: 3, Interesting) by Thexalon on Friday January 24, @10:12PM (1 child)
It's also a very stupid law.
Pedos exist, and to the best of my knowledge there is no known cure or treatment for the condition other than high-velocity lead poisoning. To the best of my knowledge, they haven't even proven that lopping off gonads would solve the problem. They're going to get their jollies somewhere. I'd much, much rather they get their jollies from watching AI-generated or animated smut than live-action smut, and much rather they get their jollies from watching smut than violating the real, live children around them.
"Think of how stupid the average person is. Then realize half of 'em are stupider than that." - George Carlin
(Score: 0) by Anonymous Coward on Saturday January 25, @02:50AM
Seriously though, the argument against child porn is similar to the argument against porn generally, and also against "violent video games/movies".
There are tons of people playing games where they enjoy killing tons of people, but they don't do it in real life. Same for the porn thing - tons of "virgins for life" watching porn out there...
Not everyone who watches MILF porn will go around having sex with MILFs...