posted by janrinok on Monday November 18 2019, @02:43AM   Printer-friendly
from the problem-of-our-own-making dept.

Submitted via IRC for SoyCow1337

How Laws Against Child Sexual Abuse Imagery Can Make It Harder to Detect

Child sexual abuse photos and videos are among the most toxic materials online. It is against the law to view the imagery, and anybody who comes across it must report it to the federal authorities.

So how can tech companies, under pressure to remove the material, identify newly shared photos and videos without breaking the law? They use software — but first they have to train it, running repeated tests to help it accurately recognize illegal content.
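(As an aside, the article doesn't describe any company's actual method, but known-imagery filters of the kind the companies deploy are commonly described as comparing perceptual hashes, which survive small edits to an image, against a vetted blocklist. A minimal sketch of that matching step, with entirely hypothetical hash values:)

```python
# Illustrative sketch only: compare 64-bit perceptual hashes against a
# blocklist, allowing a few differing bits so near-duplicates still match.

def hamming(a: int, b: int) -> int:
    """Count differing bits between two 64-bit perceptual hashes."""
    return bin(a ^ b).count("1")

def matches_known(candidate: int, known_hashes: set, threshold: int = 5) -> bool:
    """Flag a candidate whose hash is within `threshold` bits of any known hash."""
    return any(hamming(candidate, h) <= threshold for h in known_hashes)

# Hypothetical hash values for demonstration.
known = {0xDEADBEEFCAFEBABE, 0x0123456789ABCDEF}
print(matches_known(0xDEADBEEFCAFEBABF, known))  # near-duplicate of a known hash: True
print(matches_known(0xFFFFFFFFFFFFFFFF, known))  # unrelated image: False
```

Detecting *new* imagery, which is what the article says the companies are struggling with, is exactly the part this kind of list lookup cannot do; that requires a trained classifier, which requires training data the companies cannot legally hold.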

Google has made progress, according to company officials, but its methods have not been made public. Facebook has, too, but there are still questions about whether it follows the letter of the law. Microsoft, which has struggled to keep known imagery off its search engine, Bing, is frustrated by the legal hurdles in identifying new imagery, a spokesman said.

The three tech giants are among the few companies with the resources to develop artificial intelligence systems to take on the challenge. One route for the companies is greater cooperation with the federal authorities, including seeking permission to keep new photos and videos for the purposes of developing the detection software.

But that approach runs into a larger privacy debate involving the sexual abuse material: How closely should tech companies and the federal government work to shut it down? And what would prevent their cooperation from extending to other online activity?

Paul Ohm, a former prosecutor in the Justice Department's computer crime and intellectual property section, said the laws governing child sexual abuse imagery were among the "fiercest criminal laws" on the books.

"Just the simple act of shipping the images from one A.I. researcher to another is going to implicate you in all kinds of federal crimes," he said.

[...] Companies in other countries are facing similar hurdles. Two Hat Security in Canada, for instance, spent years working with the authorities there to develop a system that detects child sexual abuse imagery. Because the company couldn't view or possess the imagery itself, it had to send its software to Canadian officials, who would run the training system on the illegal images and report back the results. The company would then fine-tune the software and send it back for another round of training.
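(The round trip described above amounts to a loop in which only the model and aggregate results cross the boundary, never the imagery. A toy, entirely hypothetical sketch of that loop, with a stand-in loss function in place of real training:)

```python
# Hypothetical sketch of the workflow described above: the vendor never
# possesses the data; officials train where the data lives and report
# back only a summary metric, which the vendor uses to fine-tune.

def custodian_train(lr: float) -> float:
    # Stand-in for a training run on data the vendor cannot see.
    # Returns only an aggregate metric (a toy loss minimized at lr = 0.1).
    return (lr - 0.1) ** 2

def vendor_tune(lr: float, loss: float, step: float = 0.5) -> float:
    # The vendor adjusts its hyperparameter from the reported metric alone,
    # here via a crude finite-difference estimate of the loss gradient.
    probe = custodian_train(lr + 1e-3)
    gradient = (probe - loss) / 1e-3
    return lr - step * gradient

lr = 1.0
for _ in range(10):                 # each iteration = one slow round trip
    loss = custodian_train(lr)      # officials run training, report results
    lr = vendor_tune(lr, loss)      # vendor fine-tunes and resubmits
print(round(lr, 2))                 # prints 0.1
```

Each loop iteration stands in for one of the slow ship-train-report round trips the company describes, which is why the real process took years rather than weeks.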

The system has been in development for three to four years, said the company's chief executive, Chris Priebe.

"It's a slow process," he said.


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 2) by MostCynical on Monday November 18 2019, @02:52AM (3 children)

    by MostCynical (2589) on Monday November 18 2019, @02:52AM (#921389) Journal

    train AI to recognize child porn.

    How long before the AI needs therapy?

    --
    "I guess once you start doubting, there's no end to it." -Batou, Ghost in the Shell: Stand Alone Complex
    • (Score: 2, Touché) by fustakrakich on Monday November 18 2019, @03:03AM (2 children)

      by fustakrakich (6150) on Monday November 18 2019, @03:03AM (#921396) Journal

      train AI to recognize child porn

      and it will create its own.

      --
Politics and criminals are the same thing.
      • (Score: 0) by Anonymous Coward on Monday November 18 2019, @03:24AM

        by Anonymous Coward on Monday November 18 2019, @03:24AM (#921405)

        No need. Humans already produce millions of rule 34 loli drawings every day.

      • (Score: -1, Troll) by Anonymous Coward on Monday November 18 2019, @03:38AM

        by Anonymous Coward on Monday November 18 2019, @03:38AM (#921412)

        The problem is NOT the published content, it's where they are getting the individuals to produce it. If you want to see what the 'Deep State' looks like, go look at it from enabling pedophilia and other sexually abusive activities. It is closer than you think. At your children's school, at your local police station, FBI branch, at those 'VIP parties' you never hear about directly but know they have, at camps, group activities like scouts, religion centers, sports activities.

        Epstein doesn't scratch the surface. All the 'Think of the Children' legislation is intended for the opposite effect: creating more sheep they can easily herd into their pens of depravity. Having spoken with individuals who went through this, most of the people are both ruthless and careful, have a network that help them stay out of the headlines because people who are too powerful to be named are involved. For those who don't remember, one of Epstein's clients was a Prince of the UK Royal Family.

        All this shit is doing is helping throw some lone perverts in the headlines to pretend they are doing something, most of the time the ones you don't have to worry about, or the ones who fell out of favor. Epstein's blackmail materials were an example, as was the fire that later consumed them.

        If you want to make an actual difference, start going and looking at activities your child may be involved in. Look at people you presume to trust without question in their lives. Look at the people viewing competitions and events you let them participate in. If you have chosen the wrong ones you will start noticing the tells even if you can't end up in the situation to prove them.

        Published child abuse imagery can be damaging, but it is the unpublished material which is used to 'trap' participants so they won't speak out. And once they are too far in they can't or won't take the risks to come back to a more normal life, even if they agree it should not happen to others.

        You can be naive and pretend this will make a difference, or go do what is necessary to discover for yourself and shine light on the darkness of the world, but understand the politicians and these laws are only meant to help throw shade on their own activities while stripping rights from you for their own current and future benefit while making it easier for them to find prey. Teach your own children where the limits on respect of authority are, and what sort of abuses predatory authority figures may attempt to use on them or their peers. Also, don't shine light on a single illicit relationship until you've made sure there aren't bigger cockroaches nearby waiting to scurry away from the light before you can snare them with a trap.

        As a final thought: Ask yourself what made all those Olympic competitors prone to so much sexual activity and what sort of grooming may have gone on during all those years of training to become the best of the best for their narrow realm of competition. Olympic training isn't cheap after all, and patrons can be few and far between...

  • (Score: 2) by Rosco P. Coltrane on Monday November 18 2019, @03:00AM (3 children)

    by Rosco P. Coltrane (4757) on Monday November 18 2019, @03:00AM (#921394)

    There are plenty of convicted pedophiles in jail who are very adept at finding that shit. Stick a computer in front of them: they'll happily work for the police 18 hours a week.

    • (Score: 0) by Anonymous Coward on Monday November 18 2019, @03:12AM

      by Anonymous Coward on Monday November 18 2019, @03:12AM (#921400)

      There are plenty of convicted pedophiles in jail who are very adept at finding that shit. Stick a computer in front of them: they'll happily work for the police 18 hours a week.

      Yeah! Join the fight for $15! [fightfor15.org]

    • (Score: 1, Touché) by Anonymous Coward on Monday November 18 2019, @03:27AM (1 child)

      by Anonymous Coward on Monday November 18 2019, @03:27AM (#921408)

      If you're good at something, never do it for free.

      • (Score: 0) by Anonymous Coward on Monday November 18 2019, @07:17PM

        by Anonymous Coward on Monday November 18 2019, @07:17PM (#921634)

        Sounds like a plan. Hope you like your stay in gen pop; protective custody is for people who cooperate.

  • (Score: 2) by Mojibake Tengu on Monday November 18 2019, @03:26AM (5 children)

    by Mojibake Tengu (8598) on Monday November 18 2019, @03:26AM (#921406) Journal

    Lawmakers could handle this by introducing strict, state-controlled licensing, just like they did with nuclear materials, the heavy weapons trade, explosives, and dangerous chemicals. It's not a perfect model, but it mostly works, in any country.

    Why it hasn't occurred to them to do this remains a mystery. A conflict of interest?

    --
    Respect Authorities. Know your social status. Woke responsibly.
    • (Score: 2) by shortscreen on Monday November 18 2019, @04:26AM (3 children)

      by shortscreen (2252) on Monday November 18 2019, @04:26AM (#921427) Journal

      What sort of person would apply for a license to view child porn?

      • (Score: 2) by c0lo on Monday November 18 2019, @05:34AM

        by c0lo (156) Subscriber Badge on Monday November 18 2019, @05:34AM (#921437) Journal

        It seems that some corporate persons would, ain't it?
        At least this is what the TFS/A imply.

        --
        https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
      • (Score: 3, Touché) by Mojibake Tengu on Monday November 18 2019, @06:11AM (1 child)

        by Mojibake Tengu (8598) on Monday November 18 2019, @06:11AM (#921441) Journal

        What sort of person would apply for a license to kill people?

        --
        Respect Authorities. Know your social status. Woke responsibly.
        • (Score: 2) by dry on Monday November 18 2019, @10:13PM

          by dry (223) on Monday November 18 2019, @10:13PM (#921709) Journal

          James Bond?

    • (Score: 2) by maxwell demon on Monday November 18 2019, @11:52AM

      by maxwell demon (1608) on Monday November 18 2019, @11:52AM (#921473) Journal

      Are you the Patrician of Ankh-Morpork?

      --
      The Tao of math: The numbers you can count are not the real numbers.
  • (Score: 0) by Anonymous Coward on Monday November 18 2019, @03:27AM

    by Anonymous Coward on Monday November 18 2019, @03:27AM (#921407)

    I'm surprised they didn't lash out at tech incels and say they must be promoting it because they haven't eliminated it yet.

  • (Score: 0) by Anonymous Coward on Monday November 18 2019, @03:30AM (20 children)

    by Anonymous Coward on Monday November 18 2019, @03:30AM (#921409)

    anybody who comes across it must report it to the federal authorities.

    Oh really? Not that I'd believe anything the NYT puts on paper nowadays, but if "It is against the law to view the imagery," then it behooves anyone running into what could be construed as CP to make a speedy U-turn away, and probably wipe and reinstall their systems from backup, lest they become an easy victim of law enforcement and the 6 o'clock news.

    • (Score: 2) by Pino P on Monday November 18 2019, @03:33AM (13 children)

      by Pino P (4721) on Monday November 18 2019, @03:33AM (#921411) Journal

      One can find drawn CP through Google Images with ostensibly innocuous query words, such as lie to me pinocchio.

      • (Score: 2) by The Mighty Buzzard on Monday November 18 2019, @03:49AM (11 children)

        by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Monday November 18 2019, @03:49AM (#921413) Homepage Journal

        Drawn images don't have actual children in them. So not illegal or even immoral, just fucked up.

        --
        My rights don't end where your fear begins.
        • (Score: 2) by shortscreen on Monday November 18 2019, @04:22AM (8 children)

          by shortscreen (2252) on Monday November 18 2019, @04:22AM (#921422) Journal

          Someone has already been convicted for this, so precedent says it is illegal.

          • (Score: 2) by The Mighty Buzzard on Monday November 18 2019, @01:14PM (7 children)

            by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Monday November 18 2019, @01:14PM (#921483) Homepage Journal

            In the US? Fake News about Fake Nudes? Which is to say, Citation Needed.

            --
            My rights don't end where your fear begins.
            • (Score: 2) by FatPhil on Monday November 18 2019, @04:33PM (3 children)

              by FatPhil (863) <pc-soylentNO@SPAMasdf.fi> on Monday November 18 2019, @04:33PM (#921564) Homepage
              I'd say this one falls into the "well known, old news" camp (well, not that old; it's still a teenager). You must be Rip van Winkle if you didn't notice it, and you're being either lazy or dim if you can't come up with a search expression that would satisfy your request immediately.
              --
              Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
              • (Score: 2) by The Mighty Buzzard on Monday November 18 2019, @08:00PM (2 children)

                by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Monday November 18 2019, @08:00PM (#921650) Homepage Journal

                Why would I put the effort in to verify someone else's claim? And, yes, I don't watch TV or pay attention to mainstream news online. I check a couple dozen tech/science/games feeds regularly and figure anything important outside that will get brought up by someone else.

                --
                My rights don't end where your fear begins.
            • (Score: 0) by Anonymous Coward on Monday November 18 2019, @07:40PM

              by Anonymous Coward on Monday November 18 2019, @07:40PM (#921639)

              In the US? Fake News about Fake Nudes? Which is to say, Citation Needed.

              I'm at work so don't want to search myself, but search for the conviction of somebody for child pornography of Lisa Simpson (as in the cartoon character from The Simpsons). You should be able to find the citation yourself.

            • (Score: 2) by PinkyGigglebrain on Monday November 18 2019, @07:54PM (1 child)

              by PinkyGigglebrain (4458) on Monday November 18 2019, @07:54PM (#921647)

              Just in case you weren't being facetious.

              Citation Needed.

              as requested;

              http://beforeitsnews.com/eu/2012/09/child-porn-laws-arent-as-bad-as-you-think-theyre-much-much-worse-2449840.html [beforeitsnews.com]
              (unfortunately the site removes images from older articles)

              http://cbldf.org/criminal-prosecutions-of-manga/ [cbldf.org]

              --
              "Beware those who would deny you Knowledge, For in their hearts they dream themselves your Master."
        • (Score: 0) by Anonymous Coward on Monday November 18 2019, @04:24AM (1 child)

          by Anonymous Coward on Monday November 18 2019, @04:24AM (#921424)

          Drawn images don't have actual children in them. So not illegal in the United States, but is illegal in many other places or even immoral, just fucked up.

          There. FTFY.

          https://en.wikipedia.org/wiki/Legality_of_child_pornography [wikipedia.org]

          • (Score: 2) by The Mighty Buzzard on Monday November 18 2019, @01:21PM

            by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Monday November 18 2019, @01:21PM (#921485) Homepage Journal

            I've no reason to be concerned with elsewhere. The only reason I might ever have left the US (going after a Wels catfish in Europe) isn't worth putting up with oppressive European bullshit for. Not even for a week.

            --
            My rights don't end where your fear begins.
      • (Score: 0) by Anonymous Coward on Monday November 18 2019, @04:40AM

        by Anonymous Coward on Monday November 18 2019, @04:40AM (#921430)

        I'm afraid I can't frame the content of the results you would get from that query for discussion, without falling afoul of the laws.

    • (Score: 4, Insightful) by shortscreen on Monday November 18 2019, @04:20AM (1 child)

      by shortscreen (2252) on Monday November 18 2019, @04:20AM (#921421) Journal

      IANAL but (or as a result of that...) I would think a law requiring someone to incriminate themselves violates the 5th amendment. "Seeing X is illegal, and if you see X you must report it"

      • (Score: 3, Interesting) by Runaway1956 on Monday November 18 2019, @05:29AM

        by Runaway1956 (2926) Subscriber Badge on Monday November 18 2019, @05:29AM (#921436) Journal

        True. We also need the 5th to apply to our electronic devices. But law enforcement doesn't want to hear any of that. Refusing to turn over incriminating evidence only results in more charges being filed against you.

    • (Score: 2) by bradley13 on Monday November 18 2019, @02:40PM (3 children)

      by bradley13 (3053) on Monday November 18 2019, @02:40PM (#921510) Homepage Journal

      Which makes it kind of obvious why just coming across CP should not be illegal. Producing it - yes. Knowingly purchasing it - yes. Being stupid, by picking up a random thumb-drive from the parking lot? Having a malware infection that wants to blackmail you? Clicking on the wrong link while searching for porn? No. And you shouldn't have to prove your innocence in such cases either; presumption of innocence.

      More generally, possessing information of any sort should never be illegal. Whether it's industrial trade secrets, CP, Obama's birth certificate, Trump's tax returns, or nude pics of Hillary - possession should be absolutely legal. How you obtained that information may be a crime (e.g., if you stole those trade secrets, or broke into the IRS to get Hillary's birth certificate). Distributing information may be a crime (passing copies of those trade secrets, or of CP).

      Now for a question I can't answer: If you possess information that you received in a non-criminal fashion (e.g., found a thumb drive): Since it's legal for you to possess it, should it be possible to force you to delete it? You may have no right to own that information, but forced deletion seems like a weird legal edge case.

      --
      Everyone is somebody else's weirdo.
      • (Score: 2) by dry on Monday November 18 2019, @10:29PM (2 children)

        by dry (223) on Monday November 18 2019, @10:29PM (#921715) Journal

        There are also edge cases, like viewing a supposed 20-year-old who turns out to be 16, which in theory can put you in jail for a long time.

        • (Score: 0) by Anonymous Coward on Tuesday November 19 2019, @09:17AM (1 child)

          by Anonymous Coward on Tuesday November 19 2019, @09:17AM (#921874)

          That's not really an edge case, there are no edge cases. You go to jail and the kid gets tried as an adult for producing child porn of themselves and goes to jail. There was a case of two underage kids sending each other pics of themselves and they both were charged as adults for possession and production.

          Then there are cases of media of people over 24 who look younger being considered child porn as well (not sure if that was the USA or not). Basically, if it looks like it might be child porn by any definition, then it is child porn. Your baby photos included, if your local police don't like you.

          • (Score: 2) by dry on Tuesday November 19 2019, @04:03PM

            by dry (223) on Tuesday November 19 2019, @04:03PM (#921960) Journal

            It depends on jurisdiction. I know where I am, sexting is not considered child porn unless widely shared. I think the 24-year-old thing was only in Australia, and of course the age of consent varies, which can lead to being charged for viewing child porn when the people involved were of legal age where they shot the video.
            The whole thing is crazy; treating young women or even drawings the same as little kids is just one example.

  • (Score: 2) by Runaway1956 on Monday November 18 2019, @03:56AM

    by Runaway1956 (2926) Subscriber Badge on Monday November 18 2019, @03:56AM (#921415) Journal

    Because the company couldn't view or possess the imagery itself, it had to send its software to Canadian officials, who would run the training system on the illegal images and report back the results.

    Ministry of Silly Pedo Detection maybe?

  • (Score: 5, Interesting) by Appalbarry on Monday November 18 2019, @07:14AM (1 child)

    by Appalbarry (66) on Monday November 18 2019, @07:14AM (#921447) Journal

    Kind of amazing how all of these companies that can't possibly manage to block kiddie porn were able to flag and remove any and all breast-feeding photos on Facebook.

    • (Score: 2) by rigrig on Monday November 18 2019, @10:11AM

      by rigrig (5129) <soylentnews@tubul.net> on Monday November 18 2019, @10:11AM (#921465) Homepage

      Those aren't illegal, so they can just add them to Mark's private collection and train the AI from that.

      --
      No one remembers the singer.
  • (Score: -1, Troll) by Anonymous Coward on Monday November 18 2019, @08:12AM (1 child)

    by Anonymous Coward on Monday November 18 2019, @08:12AM (#921449)

    Cartoons are satanic videos containing sexually explicit imagery intended to teach sexuality to children at the earliest age. So as to get them to agree to sexual encounters with devil-worshiping khazar jewish rats and their friends. These khazar creatures (and their friends) have no human morals and cannot imagine being a human. They will plot against us using everything available to them in order to destroy us.

    Do not let your children watch cartoons or television. These things are not natural.

  • (Score: 2) by jmichaelhudsondotnet on Monday November 18 2019, @04:44PM

    by jmichaelhudsondotnet (8122) on Monday November 18 2019, @04:44PM (#921567) Journal

    In order for programmers to train this, don't they need a large collection of......

    This means though that somewhere these companies may be trying to pass a law that lets them under some circumstances break cp laws, for the children.

    Or am I missing something?

    My point is that a private company cannot do this ethically and that a government would have a very difficult time doing so.

    And since Epstein could not be stopped by our government, we have the worst of all worlds. And as if Google could be trusted to enforce something like this uniformly and without corruption, lol.

    They lied about tracking android maps, they have no credibility and what this issue needs more than anything is credibility.

    Otherwise this is just an 'arrest all our political enemies' button, which they will eventually use.

  • (Score: 2) by Bot on Monday November 18 2019, @10:17PM

    by Bot (3902) on Monday November 18 2019, @10:17PM (#921711) Journal

    1. train AI to detect nudity and sex
    2. train other AI to detect youngsters
    3. pipe the detected set of one through the second
    4. no ???
    5. PROFIT!!!

    --
    Account abandoned.
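(Steps 1 to 3 in the comment above, taken at face value, are just the composition of two classifiers. A toy sketch with stand-in predicates operating on fake metadata; real systems would of course use trained models:)

```python
# Toy sketch of the pipeline above: run detector 1, then pipe its
# detections through detector 2. Both predicates are stand-ins.

def detects_nudity(item) -> bool:          # step 1 (stand-in classifier)
    return item["nsfw_score"] > 0.8

def detects_minor(item) -> bool:           # step 2 (stand-in classifier)
    return item["age_estimate"] < 18

def pipeline(items):                       # step 3: compose the two
    return [i for i in items if detects_nudity(i) and detects_minor(i)]

items = [
    {"id": 1, "nsfw_score": 0.9, "age_estimate": 15},
    {"id": 2, "nsfw_score": 0.9, "age_estimate": 30},
    {"id": 3, "nsfw_score": 0.1, "age_estimate": 15},
]
print([i["id"] for i in pipeline(items)])  # prints [1]
```

The catch, per the article, is that training either stand-in into a real detector requires the very data the companies cannot legally possess.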