
posted by janrinok on Monday November 18 2019, @02:43AM
from the problem-of-our-own-making dept.

Submitted via IRC for SoyCow1337

How Laws Against Child Sexual Abuse Imagery Can Make It Harder to Detect

Child sexual abuse photos and videos are among the most toxic materials online. It is against the law to view the imagery, and anybody who comes across it must report it to the federal authorities.

So how can tech companies, under pressure to remove the material, identify newly shared photos and videos without breaking the law? They use software — but first they have to train it, running repeated tests to help it accurately recognize illegal content.
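
The "train it, running repeated tests" step the article refers to is, at bottom, an ordinary supervised-learning loop: gather labeled examples, fit a model, measure how often it is right, and repeat. Below is a minimal, hypothetical sketch of such a loop in PyTorch; the folder paths, the two-class setup, and the choice of ResNet-18 are illustrative assumptions, not anything the companies have described, and the legal problem the article raises is precisely that the labeled examples cannot lawfully sit on a company's own disk.

    import torch
    from torch import nn
    from torch.utils.data import DataLoader
    from torchvision import datasets, models, transforms

    # Hypothetical stand-in dataset in a folder-per-class layout,
    # e.g. data/train/flagged and data/train/benign.
    tfm = transforms.Compose([
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
    ])
    train_set = datasets.ImageFolder("data/train", transform=tfm)
    val_set = datasets.ImageFolder("data/val", transform=tfm)
    train_loader = DataLoader(train_set, batch_size=32, shuffle=True)
    val_loader = DataLoader(val_set, batch_size=32)

    # Generic image classifier with a two-class output head.
    model = models.resnet18(weights=None)
    model.fc = nn.Linear(model.fc.in_features, 2)
    opt = torch.optim.Adam(model.parameters(), lr=1e-4)
    loss_fn = nn.CrossEntropyLoss()

    for epoch in range(10):  # the "repeated tests" of the article
        model.train()
        for images, labels in train_loader:
            opt.zero_grad()
            loss = loss_fn(model(images), labels)
            loss.backward()
            opt.step()

        # Evaluate after every pass to see whether accuracy improves.
        model.eval()
        correct = total = 0
        with torch.no_grad():
            for images, labels in val_loader:
                preds = model(images).argmax(dim=1)
                correct += (preds == labels).sum().item()
                total += labels.numel()
        print(f"epoch {epoch}: accuracy {correct / total:.3f}")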

Google has made progress, according to company officials, but its methods have not been made public. Facebook has, too, but there are still questions about whether it follows the letter of the law. Microsoft, which has struggled to keep known imagery off its search engine, Bing, is frustrated by the legal hurdles in identifying new imagery, a spokesman said.

The three tech giants are among the few companies with the resources to develop artificial intelligence systems to take on the challenge. One route for the companies is greater cooperation with the federal authorities, including seeking permission to keep new photos and videos for the purposes of developing the detection software.

But that approach runs into a larger privacy debate involving the sexual abuse material: How closely should tech companies and the federal government work to shut it down? And what would prevent their cooperation from extending to other online activity?

Paul Ohm, a former prosecutor in the Justice Department's computer crime and intellectual property section, said the laws governing child sexual abuse imagery were among the "fiercest criminal laws" on the books.

"Just the simple act of shipping the images from one A.I. researcher to another is going to implicate you in all kinds of federal crimes," he said.

[...] Companies in other countries are facing similar hurdles. Two Hat Security in Canada, for instance, spent years working with the authorities there to develop a system that detects child sexual abuse imagery. Because the company couldn't view or possess the imagery itself, it had to send its software to Canadian officials, who would run the training system on the illegal images and report back the results. The company would then fine-tune the software and send it back for another round of training.
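
The back-and-forth the article describes, in which the software travels to the data and only results travel back, amounts to a simple iterative protocol: the vendor ships a training package, the authorities run it against the sequestered images, and the vendor tunes its configuration from summary metrics alone. The sketch below is a hypothetical illustration of that loop; the names (TrainingReport, run_training_on_site, tune) and the simulated numbers are invented for this example and are not Two Hat's actual system.

    import random
    from dataclasses import dataclass

    @dataclass
    class TrainingReport:
        """Aggregate results sent back by the authorities; contains no imagery."""
        accuracy: float
        false_positive_rate: float

    def run_training_on_site(round_number: int, config: dict) -> TrainingReport:
        """Stand-in for the step the article attributes to Canadian officials:
        run the vendor's packaged training code on the sequestered images and
        report back only summary metrics. Simulated here, since the real step
        happens on systems the vendor never sees."""
        progress = min(1.0, 0.6 + 0.02 * round_number)
        return TrainingReport(
            accuracy=progress + random.uniform(-0.02, 0.02),
            false_positive_rate=max(0.0, 0.10 - 0.005 * round_number),
        )

    def tune(config: dict, report: TrainingReport) -> dict:
        """Vendor-side adjustment based on metrics alone, e.g. raising the
        decision threshold when false positives are too high (illustrative)."""
        if report.false_positive_rate > 0.01:
            config["threshold"] = min(0.95, config["threshold"] + 0.05)
        return config

    # The years-long loop the article describes: ship the software, wait for
    # metrics, fine-tune, ship it again.
    config = {"threshold": 0.5, "learning_rate": 1e-4}
    for round_number in range(40):
        report = run_training_on_site(round_number, config)
        print(f"round {round_number}: acc={report.accuracy:.3f} "
              f"fpr={report.false_positive_rate:.3f}")
        if report.accuracy >= 0.95 and report.false_positive_rate <= 0.01:
            break
        config = tune(config, report)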

The system has been in development for three to four years, said the company's chief executive, Chris Priebe.

"It's a slow process," he said.


Original Submission

 
  • (Score: 2) by Mojibake Tengu on Monday November 18 2019, @03:26AM (5 children)

    by Mojibake Tengu (8598) on Monday November 18 2019, @03:26AM (#921406) Journal

    Lawmakers could handle this by introducing strict licensing, controlled by the state, just as they did with nuclear materials, the heavy weapons trade, explosives, and dangerous chemicals. It's not a perfect model, but it mostly works, in any country.

    Why it hasn't occurred to them to do this remains a mystery. A conflict of interest?

    --
    Respect Authorities. Know your social status. Woke responsibly.
  • (Score: 2) by shortscreen on Monday November 18 2019, @04:26AM (3 children)

    by shortscreen (2252) on Monday November 18 2019, @04:26AM (#921427) Journal

    What sort of person would apply for a license to view child porn?

    • (Score: 2) by c0lo on Monday November 18 2019, @05:34AM

      by c0lo (156) Subscriber Badge on Monday November 18 2019, @05:34AM (#921437) Journal

      It seems that some corporate persons would, doesn't it?
      At least that's what TFS and TFA imply.

      --
      https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
    • (Score: 3, Touché) by Mojibake Tengu on Monday November 18 2019, @06:11AM (1 child)

      by Mojibake Tengu (8598) on Monday November 18 2019, @06:11AM (#921441) Journal

      What sort of person would apply for a license to kill people?

      --
      Respect Authorities. Know your social status. Woke responsibly.
      • (Score: 2) by dry on Monday November 18 2019, @10:13PM

        by dry (223) on Monday November 18 2019, @10:13PM (#921709) Journal

        James Bond?

  • (Score: 2) by maxwell demon on Monday November 18 2019, @11:52AM

    by maxwell demon (1608) on Monday November 18 2019, @11:52AM (#921473) Journal

    Are you the Patrician of Ankh-Morpork?

    --
    The Tao of math: The numbers you can count are not the real numbers.