
posted by CoolHand on Monday August 10 2015, @07:01PM   Printer-friendly
from the thinking-of-the-children dept.

The BBC reports that the UK-based Internet Watch Foundation is sharing hash lists with Google, Facebook, and Twitter to prevent the upload of child abuse imagery:

Web giants Google, Facebook and Twitter have joined forces with a British charity in a bid to remove millions of indecent child images from the net. In a UK first, anti-abuse organisation Internet Watch Foundation (IWF) has begun sharing lists of indecent images, identified by unique "hash" codes. Wider use of the photo-tagging system could be a "game changer" in the fight against paedophiles, the charity said. Internet security experts said images on the "darknet" would not be detected.

The IWF, which works to take down indecent images of children, allocates to each picture it finds a "hash" - a unique code, sometimes referred to as a digital finger-print. By sharing "hash lists" of indecent pictures of children, Google, Facebook and Twitter will be able to stop those images from being uploaded to their sites.
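The matching step described above can be sketched in a few lines. This is an illustrative assumption, not the IWF's actual system: real deployments use perceptual hashes (such as Microsoft's PhotoDNA) rather than a plain cryptographic digest, so that near-duplicates also match, and the hash values below are made up for the example.

```python
import hashlib

# Hypothetical shared "hash list" (illustrative values only).
# A real system would hold perceptual hashes, not SHA-256 digests.
BLOCKED_HASHES = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def is_blocked(image_bytes: bytes) -> bool:
    """Return True if the uploaded image's hash appears on the shared list."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in BLOCKED_HASHES
```

At upload time, the site hashes the incoming file and rejects it if the digest is on the list; no image data needs to be exchanged between organisations, only the hashes.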


Original Submission

  • (Score: 3, Interesting) by Anonymous Coward on Monday August 10 2015, @08:59PM (#220902)

    > There should really be four hashes per image (original image, and rotated 90, 180, and 270 degrees), so that this can't be avoided by simply rotating the image,

    That's not how these kinds of hashes work. These aren't "hashes" in the basic, most literal sense you learned in comp-sci 200.

    They work more like the way Google Images lets you search for similar images - they can match across any degree of rotation, all kinds of geometric distortions, cropping, embedding in larger images, colour manipulation, etc.

    Here's a wikipedia article on one such system, Microsoft's PhotoDNA. [wikipedia.org]
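    [Ed. note: a minimal sketch of the idea behind perceptual hashing, using an "average hash" as a simplified stand-in - PhotoDNA itself is proprietary and works differently in detail. Each bit records whether a thumbnail pixel is brighter than the image's mean, so small edits flip few or no bits, unlike a cryptographic hash where any change scrambles the whole digest.]

    ```python
    def average_hash(pixels):
        """pixels: 2D list of grayscale values (0-255), e.g. an 8x8 thumbnail.
        Each output bit says whether that pixel is above mean brightness."""
        flat = [p for row in pixels for p in row]
        mean = sum(flat) / len(flat)
        return ''.join('1' if p > mean else '0' for p in flat)

    def hamming_distance(h1, h2):
        """Count of differing bits; a small distance means visually similar."""
        return sum(a != b for a, b in zip(h1, h2))

    original = [[10, 200], [30, 220]]
    tweaked  = [[12, 198], [33, 219]]   # slight brightness tweaks

    # The two hashes are identical despite the pixel changes.
    print(hamming_distance(average_hash(original), average_hash(tweaked)))  # 0
    ```

    Matching is then a nearest-neighbour lookup: an upload is flagged when its hash is within some small Hamming distance of a listed hash, rather than requiring an exact match.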

  • (Score: 0) by Anonymous Coward on Monday August 10 2015, @09:41PM (#220923)

    Yeah, right, that's going to work so much better than the *other* Google algorithm for automagically understanding pictures:

    https://soylentnews.org/article.pl?sid=15/06/30/1817241 [soylentnews.org]

  • (Score: 2) by FatPhil (863) <{pc-soylent} {at} {asdf.fi}> on Tuesday August 11 2015, @06:53AM (#221138) Homepage
    That looks like spammy publicity material, not encyclopedic material.
    E.g. "PhotoDNA is primarily used in the prevention of child pornography" - citation needed - name one case where the software has *prevented* child pornography. Nothing that only detects can prevent. And one must also remember it detects nothing that the human eye couldn't easily detect with better accuracy.

    This is an evolutionary step which the gazelles will win, as it's *so easily* worked around.

    Until they make password-protected zip files illegal. Expect legislation soon...
    --
    Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves