
posted by janrinok on Monday November 18 2019, @02:43AM   Printer-friendly
from the problem-of-our-own-making dept.

Submitted via IRC for SoyCow1337

How Laws Against Child Sexual Abuse Imagery Can Make It Harder to Detect

Child sexual abuse photos and videos are among the most toxic materials online. It is against the law to view the imagery, and anybody who comes across it must report it to the federal authorities.

So how can tech companies, under pressure to remove the material, identify newly shared photos and videos without breaking the law? They use software — but first they have to train it, running repeated tests to help it accurately recognize illegal content.
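To make the training step concrete, here is a minimal sketch of the kind of classifier fine-tuning the article alludes to, written in PyTorch. The framework, model choice, and labels are assumptions for illustration only; none of the companies has published its actual pipeline.

    # Illustrative sketch only: fine-tune a stock image classifier to
    # flag suspect content. The data loader is assumed to exist and to
    # yield (image_batch, label_batch) pairs, labels 0 (benign) / 1 (flagged).
    import torch
    import torch.nn as nn
    from torchvision import models

    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    model.fc = nn.Linear(model.fc.in_features, 2)  # two classes: benign / flagged

    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    loss_fn = nn.CrossEntropyLoss()

    def train_epoch(loader):
        model.train()
        for images, labels in loader:
            optimizer.zero_grad()
            loss = loss_fn(model(images), labels)
            loss.backward()
            optimizer.step()

The legal problem the article describes is precisely that assembling the training set for such a loop requires possessing the imagery.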

Google has made progress, according to company officials, but its methods have not been made public. Facebook has, too, but there are still questions about whether it follows the letter of the law. Microsoft, which has struggled to keep known imagery off its search engine, Bing, is frustrated by the legal hurdles in identifying new imagery, a spokesman said.
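For known imagery, the standard technique is hash matching against a shared database of fingerprints; Microsoft's PhotoDNA is the best-known example. A toy illustration of the idea follows. Real systems use perceptual hashes that survive resizing and re-encoding, so the exact SHA-256 matching and the database entry here are simplifications invented for the sketch.

    # Toy known-image matcher: compare an upload's hash against a
    # database of hashes of previously identified illegal images.
    # Real deployments use perceptual hashes (e.g. PhotoDNA), not exact
    # cryptographic hashes; the entry below is a made-up value.
    import hashlib

    KNOWN_HASHES = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def is_known(image_bytes: bytes) -> bool:
        return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

Hash matching only catches images already in the database, which is why new imagery requires the trained classifiers discussed above.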

The three tech giants are among the few companies with the resources to develop artificial intelligence systems to take on the challenge. One route for the companies is greater cooperation with the federal authorities, including seeking permission to keep new photos and videos for the purposes of developing the detection software.

But that approach runs into a larger privacy debate involving the sexual abuse material: How closely should tech companies and the federal government work to shut it down? And what would prevent their cooperation from extending to other online activity?

Paul Ohm, a former prosecutor in the Justice Department's computer crime and intellectual property section, said the laws governing child sexual abuse imagery were among the "fiercest criminal laws" on the books.

"Just the simple act of shipping the images from one A.I. researcher to another is going to implicate you in all kinds of federal crimes," he said.

[...] Companies in other countries are facing similar hurdles. Two Hat Security in Canada, for instance, spent years working with the authorities there to develop a system that detects child sexual abuse imagery. Because the company couldn't view or possess the imagery itself, it had to send its software to Canadian officials, who would run the training system on the illegal images and report back the results. The company would then fine-tune the software and send it back for another round of training.
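A hypothetical sketch of that round trip from the vendor's side: the software and a training configuration go out, only aggregate metrics come back, and tuning happens blind. Every name and threshold here is invented; the article does not describe Two Hat's actual interface.

    # Hypothetical vendor-side loop: officials run each training round
    # on the real data and return only aggregate metrics, so the vendor
    # tunes without ever seeing an image.
    def tune_remotely(config, rounds, submit_to_authority):
        """submit_to_authority(config) -> metrics, e.g. {"recall": 0.81}."""
        for _ in range(rounds):
            metrics = submit_to_authority(config)  # runs on the officials' side
            if metrics["recall"] < 0.90:           # adjust blind, from metrics alone
                config["learning_rate"] *= 0.5
                config["epochs"] += 5
        return config

With each round requiring a shipment to the authorities and back, the multi-year timeline described below is unsurprising.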

The system has been in development for three to four years, said the company's chief executive, Chris Priebe.

"It's a slow process," he said.


Original Submission

 
  • (Score: 0) by Anonymous Coward on Monday November 18 2019, @03:30AM (20 children)

    anybody who comes across it must report it to the federal authorities.

    Oh really? Why would I believe anything the NYT puts on paper nowadays? But since "It is against the law to view the imagery," it behooves anyone who runs into anything that could be construed as CP to make a speedy U-turn, and probably to wipe and reinstall their systems from backup, lest they become an easy victim of law enforcement and the 6 o'clock news.

  • (Score: 2) by Pino P on Monday November 18 2019, @03:33AM (13 children)

    One can find drawn CP through Google Images with ostensibly innocuous query words, such as "lie to me pinocchio".

    • (Score: 2) by The Mighty Buzzard on Monday November 18 2019, @03:49AM (11 children)

      Drawn images don't have actual children in them. So not illegal or even immoral, just fucked up.

      --
      My rights don't end where your fear begins.
      • (Score: 2) by shortscreen on Monday November 18 2019, @04:22AM (8 children)

        Someone has already been convicted for this, so precedent says it is illegal.

        • (Score: 2) by The Mighty Buzzard on Monday November 18 2019, @01:14PM (7 children)

          In the US? Fake News about Fake Nudes? Which is to say, Citation Needed.

          --
          My rights don't end where your fear begins.
          • (Score: 2) by FatPhil on Monday November 18 2019, @04:33PM (3 children)

            I'd say this one falls into the "well known, old news" camp (well, not that old; it's still a teenager). You must be Rip Van Winkle if you didn't notice it, and you're being either lazy or dim if you can't come up with a search expression that would satisfy your request immediately.
            --
            Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
            • (Score: 2) by The Mighty Buzzard on Monday November 18 2019, @08:00PM (2 children)

              Why would I put the effort in to verify someone else's claim? And, yes, I don't watch TV or pay attention to mainstream news online. I check a couple dozen tech/science/games feeds regularly and figure anything important outside that will get brought up by someone else.

              --
              My rights don't end where your fear begins.
          • (Score: 0) by Anonymous Coward on Monday November 18 2019, @07:40PM

            In the US? Fake News about Fake Nudes? Which is to say, Citation Needed.

            I'm at work so I don't want to search myself, but search for the conviction of somebody for child pornography of Lisa Simpson (as in the cartoon character from The Simpsons). You should be able to find the citation yourself.

          • (Score: 2) by PinkyGigglebrain on Monday November 18 2019, @07:54PM (1 child)

            Just in case you weren't being facetious.

            Citation Needed.

            As requested:

            http://beforeitsnews.com/eu/2012/09/child-porn-laws-arent-as-bad-as-you-think-theyre-much-much-worse-2449840.html [beforeitsnews.com]
            (unfortunately the site removes images from older articles)

            http://cbldf.org/criminal-prosecutions-of-manga/ [cbldf.org]

            --
            "Beware those who would deny you Knowledge, For in their hearts they dream themselves your Master."
      • (Score: 0) by Anonymous Coward on Monday November 18 2019, @04:24AM (1 child)

        Drawn images don't have actual children in them. So not illegal in the United States, but illegal in many other places, or even immoral, just fucked up.

        There. FTFY.

        https://en.wikipedia.org/wiki/Legality_of_child_pornography [wikipedia.org]

        • (Score: 2) by The Mighty Buzzard on Monday November 18 2019, @01:21PM

          I've no reason to be concerned with elsewhere. The only reason I'd ever have to leave the US (going after a Wels catfish in Europe) isn't worth putting up with oppressive European bullshit for. Not even for a week.

          --
          My rights don't end where your fear begins.
    • (Score: 0) by Anonymous Coward on Monday November 18 2019, @04:40AM

      I'm afraid I can't frame the content of the results you would get from that query for discussion, without falling afoul of the laws.

  • (Score: 4, Insightful) by shortscreen on Monday November 18 2019, @04:20AM (1 child)

    IANAL, but (or as a result of that...) I would think a law requiring someone to incriminate themselves violates the 5th Amendment: "Seeing X is illegal, and if you see X you must report it."

    • (Score: 3, Interesting) by Runaway1956 on Monday November 18 2019, @05:29AM

      True. We also need the 5th to apply to our electronic devices. But law enforcement doesn't want to hear any of that. Refusing to turn over incriminating evidence only results in more charges being filed against you.

  • (Score: 2) by bradley13 on Monday November 18 2019, @02:40PM (3 children)

    Which makes it kind of obvious why just coming across CP should not be illegal. Producing it - yes. Knowingly purchasing it - yes. Being stupid, by picking up a random thumb-drive from the parking lot? Having a malware infection that wants to blackmail you? Clicking on the wrong link while searching for porn? No. And you shouldn't have to prove your innocence in such cases either; presumption of innocence.

    More generally, possessing information of any sort should never be illegal. Whether it's industrial trade secrets, CP, Obama's birth certificate, Trump's tax returns, or nude pics of Hillary - possession should be absolutely legal. How you obtained that information may be a crime (e.g., if you stole those trade secrets, or broke into the IRS to get those tax returns). Distributing information may be a crime (passing copies of those trade secrets, or of CP).

    Now for a question I can't answer: If you possess information that you received in a non-criminal fashion (e.g., you found a thumb drive): Since it's legal for you to possess it, should it be possible to force you to delete it? You don't have the right to own those trade secrets, but forced deletion seems like a weird legal edge case.

    --
    Everyone is somebody else's weirdo.
    • (Score: 2) by dry on Monday November 18 2019, @10:29PM (2 children)

      There are also edge cases: viewing a supposed 20-year-old who turns out to be 16, which in theory can put you in jail for a long time.

      • (Score: 0) by Anonymous Coward on Tuesday November 19 2019, @09:17AM (1 child)

        That's not really an edge case; there are no edge cases. You go to jail, and the kid gets tried as an adult for producing child porn of themselves and goes to jail too. There was a case of two underage kids sending each other pics of themselves, and both were charged as adults with possession and production.

        Then there are cases of media of people over 24 who look younger being considered child porn as well (not sure if that was the USA or not). Basically, if it looks like it might be child porn by any definition, then it is child porn. Your baby photos included, if your local police don't like you.

        • (Score: 2) by dry on Tuesday November 19 2019, @04:03PM

          It depends on jurisdiction. Where I am, I know sexting is not considered child porn unless widely shared. I think the 24-year-old thing was only in Australia, and of course the age of consent varies, which can lead to being charged with viewing child porn even when the people involved were of legal age where they shot the video.
          The whole thing is crazy; treating young women or even drawings the same as little kids is just one example.