
posted by janrinok on Monday November 18 2019, @02:43AM   Printer-friendly
from the problem-of-our-own-making dept.

Submitted via IRC for SoyCow1337

How Laws Against Child Sexual Abuse Imagery Can Make It Harder to Detect

Child sexual abuse photos and videos are among the most toxic materials online. It is against the law to view the imagery, and anybody who comes across it must report it to the federal authorities.

So how can tech companies, under pressure to remove the material, identify newly shared photos and videos without breaking the law? They use software — but first they have to train it, running repeated tests to help it accurately recognize illegal content.

Google has made progress, according to company officials, but its methods have not been made public. Facebook has, too, but there are still questions about whether it follows the letter of the law. Microsoft, which has struggled to keep known imagery off its search engine, Bing, is frustrated by the legal hurdles in identifying new imagery, a spokesman said.

The three tech giants are among the few companies with the resources to develop artificial intelligence systems to take on the challenge. One route for the companies is greater cooperation with the federal authorities, including seeking permission to keep new photos and videos for the purposes of developing the detection software.

But that approach runs into a larger privacy debate involving the sexual abuse material: How closely should tech companies and the federal government work to shut it down? And what would prevent their cooperation from extending to other online activity?

Paul Ohm, a former prosecutor in the Justice Department's computer crime and intellectual property section, said the laws governing child sexual abuse imagery were among the "fiercest criminal laws" on the books.

"Just the simple act of shipping the images from one A.I. researcher to another is going to implicate you in all kinds of federal crimes," he said.

[...] Companies in other countries are facing similar hurdles. Two Hat Security in Canada, for instance, spent years working with the authorities there to develop a system that detects child sexual abuse imagery. Because the company couldn't view or possess the imagery itself, it had to send its software to Canadian officials, who would run the training system on the illegal images and report back the results. The company would then fine-tune the software and send it back for another round of training.

The system has been in development for three to four years, said the company's chief executive, Chris Priebe.

"It's a slow process," he said.


Original Submission

 
  • (Score: 2) by dry (223) on Monday November 18 2019, @10:29PM (#921715) Journal (2 children)

    There are also edge cases: viewing a supposed 20-year-old who turns out to be 16, which in theory can put you in jail for a long time.

  • (Score: 0) by Anonymous Coward on Tuesday November 19 2019, @09:17AM (#921874) (1 child)

    That's not really an edge case; there are no edge cases. You go to jail, and the kid gets tried as an adult for producing child porn of themselves and goes to jail too. There was a case of two underage kids sending each other pics of themselves, and both were charged as adults for possession and production.

    Then there are cases of media of people over 24 who look younger being treated as child porn as well (not sure if that was the USA or not). Basically, if it looks like it might be child porn by any definition, then it is child porn. Your baby photos included, if your local police don't like you.

    • (Score: 2) by dry (223) on Tuesday November 19 2019, @04:03PM (#921960) Journal

      It depends on the jurisdiction. I know that where I am, sexting is not considered child porn unless it is widely shared. I think the 24-year-old thing was only in Australia, and of course the age of consent varies, which can lead to being charged for viewing child porn even when the people involved were of legal age where they shot the video.
      The whole thing is crazy; treating young women, or even drawings, the same as little kids is just one example.