
SoylentNews is people

posted by janrinok on Monday November 18 2019, @02:43AM
from the problem-of-our-own-making dept.

Submitted via IRC for SoyCow1337

How Laws Against Child Sexual Abuse Imagery Can Make It Harder to Detect

Child sexual abuse photos and videos are among the most toxic materials online. It is against the law to view the imagery, and anybody who comes across it must report it to the federal authorities.

So how can tech companies, under pressure to remove the material, identify newly shared photos and videos without breaking the law? They use software — but first they have to train it, running repeated tests to help it accurately recognize illegal content.

Google has made progress, according to company officials, but its methods have not been made public. Facebook has, too, but there are still questions about whether it follows the letter of the law. Microsoft, which has struggled to keep known imagery off its search engine, Bing, is frustrated by the legal hurdles in identifying new imagery, a spokesman said.
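Keeping *known* imagery off a service, as Bing attempts, is typically done by hash matching: a fingerprint of each uploaded file is compared against a database of fingerprints of previously identified material. Microsoft's PhotoDNA, the best-known system, uses a perceptual hash that survives re-encoding and cropping; the sketch below shows only the simpler exact-match variant, with a hypothetical blocklist value standing in for a real database:

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of previously identified files.
# Real deployments draw on hash lists maintained by bodies such as NCMEC.
KNOWN_HASHES = {
    # sha256(b"hello"), used here as a harmless stand-in entry
    "2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824",
}

def fingerprint(data: bytes) -> str:
    """Return the SHA-256 hex digest of a file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def is_known(data: bytes) -> bool:
    """True if the file exactly matches a previously identified item.

    Exact hashing misses re-encoded or edited copies; perceptual hashes
    such as PhotoDNA are designed to match those variants as well.
    """
    return fingerprint(data) in KNOWN_HASHES
```

Note that this approach only ever flags material someone has already found and catalogued, which is exactly why detecting *new* imagery requires the trained classifiers — and the legal headaches — the article describes.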

The three tech giants are among the few companies with the resources to develop artificial intelligence systems to take on the challenge. One route for the companies is greater cooperation with the federal authorities, including seeking permission to keep new photos and videos for the purposes of developing the detection software.

But that approach runs into a larger privacy debate involving the sexual abuse material: How closely should tech companies and the federal government work to shut it down? And what would prevent their cooperation from extending to other online activity?

Paul Ohm, a former prosecutor in the Justice Department's computer crime and intellectual property section, said the laws governing child sexual abuse imagery were among the "fiercest criminal laws" on the books.

"Just the simple act of shipping the images from one A.I. researcher to another is going to implicate you in all kinds of federal crimes," he said.

[...] Companies in other countries are facing similar hurdles. Two Hat Security in Canada, for instance, spent years working with the authorities there to develop a system that detects child sexual abuse imagery. Because the company couldn't view or possess the imagery itself, it had to send its software to Canadian officials, who would run the training system on the illegal images and report back the results. The company would then fine-tune the software and send it back for another round of training.

The system has been in development for three to four years, said the company's chief executive, Chris Priebe.

"It's a slow process," he said.


Original Submission

 
  • (Score: -1, Troll) by Anonymous Coward on Monday November 18 2019, @03:38AM (#921412)

    The problem is NOT the published content; it's where they are getting the individuals to produce it. If you want to see what the 'Deep State' looks like, go look at who enables pedophilia and other sexually abusive activities. It is closer than you think: at your children's school, at your local police station or FBI branch, at those 'VIP parties' you never hear about directly but know they have, at camps, and in group activities like scouts, religious centers, and sports.

    Epstein doesn't scratch the surface. All the 'Think of the Children' legislation is intended to have the opposite effect: creating more sheep they can easily herd into their pens of depravity. Having spoken with individuals who went through this, most of these people are both ruthless and careful, and they have a network that helps them stay out of the headlines, because people too powerful to be named are involved. For those who don't remember, one of Epstein's clients was a Prince of the UK Royal Family.

    All this shit does is throw some lone perverts into the headlines so they can pretend they are doing something; most of the time it's the ones you don't have to worry about, or the ones who fell out of favor. Epstein's blackmail materials were an example, as was the fire that later consumed them.

    If you want to make an actual difference, start looking at the activities your child may be involved in. Look at the people you presume to trust without question in their lives. Look at the people viewing the competitions and events you let them participate in. If you have chosen the wrong ones, you will start noticing the tells, even if you never end up in a position to prove them.

    Published child abuse imagery can be damaging, but it is the unpublished material which is used to 'trap' participants so they won't speak out. And once they are too far in they can't or won't take the risks to come back to a more normal life, even if they agree it should not happen to others.

    You can be naive and pretend this will make a difference, or go do what is necessary to discover for yourself and shine light on the darkness of the world. But understand that the politicians and these laws are only meant to throw shade on their own activities, stripping rights from you for their current and future benefit while making it easier for them to find prey. Teach your own children where the limits on respect for authority are, and what sorts of abuse predatory authority figures may attempt on them or their peers. Also, don't shine light on a single illicit relationship until you've made sure there aren't bigger cockroaches nearby, waiting to scurry away from the light before you can snare them with a trap.

    As a final thought: Ask yourself what made all those Olympic competitors prone to so much sexual activity and what sort of grooming may have gone on during all those years of training to become the best of the best for their narrow realm of competition. Olympic training isn't cheap after all, and patrons can be few and far between...
