posted by cmn32480 on Friday November 10 2017, @04:49PM   Printer-friendly
from the hackers-paradise dept.

Facebook to Fight Revenge Porn by Letting Potential Victims Upload Nudes in Advance

Submitted via IRC for TheMightyBuzzard

This new protection system works similarly to the anti-child-porn detection systems already in use at Facebook and other social media giants such as Google, Twitter, and Instagram.

It works from a database of file hashes: a digital fingerprint computed for each file.

Facebook says that once an abuser tries to upload an image marked as "revenge porn" in its database, its system will block the upload. This applies not only to images shared on the main Facebook service, but also to images shared privately via Messenger, Facebook's IM app.

The weird thing is that in order to build a database of "revenge porn" file hashes, Facebook will rely on potential victims uploading a copy of the nude photo in advance.

This process involves the victim sending a copy of the nude photo to their own account via Facebook Messenger. This means uploading a copy of the nude photo to Facebook Messenger, the very same act the victim is trying to prevent.

The victim can then report the photo to Facebook, which will create a hash of the image that the social network will use to block further uploads of the same photo.

This is possible because in April this year, Facebook modified its image reporting process to take into account images showing "revenge porn" acts.

Facebook says it's not storing a copy of the photo, but only computing the file's hash and adding it to its database of revenge porn imagery.
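The matching flow described above can be sketched in a few lines. Note that systems like this typically use perceptual hashes (e.g. Microsoft's PhotoDNA) rather than a plain cryptographic hash; SHA-256 is used below only to illustrate the "report once, compare on every upload" idea:

```python
import hashlib

blocked_hashes = set()

def report_image(data: bytes) -> None:
    """Victim reports an image: store only its hash, never the bytes."""
    blocked_hashes.add(hashlib.sha256(data).hexdigest())

def allow_upload(data: bytes) -> bool:
    """Block any upload whose hash matches a reported image."""
    return hashlib.sha256(data).hexdigest() not in blocked_hashes

photo = b"\x89PNG...fake image bytes"
report_image(photo)
assert not allow_upload(photo)        # an exact copy is blocked
assert allow_upload(photo + b"\x00")  # any byte change evades a strict hash
```

The last assertion is exactly the weakness discussed in the comments below: a cryptographic hash only catches bit-identical copies.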

Victims who fear that former or current partners may upload a nude photo online can proactively take this step to block the image from ever being uploaded to Facebook and shared among friends.

We won't be doing this. I don't even want to see hashes of you folks naked.

Source: https://www.bleepingcomputer.com/news/technology/facebook-to-fight-revenge-porn-by-letting-potential-victims-upload-nudes-in-advance/

Facebook asks Australians to send nude photos, for safety

"Worried that an ex-boyfriend or girlfriend might post your intimate photos on the internet? Facebook says it has a solution – as long as you'll hand over the photos first.

The social media giant recently announced its new plan to combat "revenge porn," when individuals post nude photos online without the consent of the subject." http://www.foxnews.com/tech/2017/11/08/facebook-says-it-needs-your-explicit-photos-to-combat-revenge-porn.html


Original Submission #1 | Original Submission #2

 
This discussion has been archived. No new comments can be posted.
  • (Score: 3, Interesting) by JNCF on Friday November 10 2017, @05:23PM (12 children)

    by JNCF (4317) on Friday November 10 2017, @05:23PM (#595208) Journal

    I'm confused as to why they need to upload the image at all; they could compute the hash in the user's browser and upload just that. With this scheme, there shouldn't be any reason to restrict minors from partaking. Maybe the images need to be reviewed by a human before hashing, to ensure that the system isn't flooded with hashes of banal data that would stop, say, political memes from spreading? That filtering could also happen after a match is detected, but it would then at least delay the banal data from being posted, encouraging users to keep poisoning their well with phony selfies.

  • (Score: 2) by The Mighty Buzzard on Friday November 10 2017, @05:34PM (4 children)

    by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Friday November 10 2017, @05:34PM (#595215) Homepage Journal

    Yep. That's the downside of client-side computing. You can't trust the data processing. Done server-side you could just let anything with a fleshtones:non-fleshtones ratio above N through to the hashing function and feel relatively certain it's worth blocking that hash.
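    A minimal sketch of that server-side gate, assuming a commonly cited RGB skin-tone heuristic (the thresholds are illustrative, not anything Facebook has published):

```python
def is_skin_tone(r, g, b):
    # Assumed heuristic: red dominant, channels spread apart, none too dark.
    return (r > 95 and g > 40 and b > 20 and
            max(r, g, b) - min(r, g, b) > 15 and
            abs(r - g) > 15 and r > g and r > b)

def worth_hashing(pixels, threshold=0.3):
    """Only hash images with a high skin-pixel ratio, so the
    database can't be flooded with hashes of memes or text."""
    skin = sum(1 for p in pixels if is_skin_tone(*p))
    return skin / len(pixels) >= threshold

mostly_skin = [(210, 160, 120)] * 8 + [(40, 40, 40)] * 2
mostly_text = [(255, 255, 255)] * 9 + [(210, 160, 120)]
assert worth_hashing(mostly_skin)
assert not worth_hashing(mostly_text)
```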

    --
    My rights don't end where your fear begins.
    • (Score: 2) by JNCF on Friday November 10 2017, @05:40PM (2 children)

      by JNCF (4317) on Friday November 10 2017, @05:40PM (#595218) Journal

      Monochrome revenge porn?

      • (Score: 2) by The Mighty Buzzard on Friday November 10 2017, @06:09PM (1 child)

        by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Friday November 10 2017, @06:09PM (#595239) Homepage Journal

        Yeah. It'd be much more difficult than text filters for certain. Low hanging fruit would be possible for a bit at least though.

        --
        My rights don't end where your fear begins.
        • (Score: 0) by Anonymous Coward on Friday November 10 2017, @10:24PM

          by Anonymous Coward on Friday November 10 2017, @10:24PM (#595378)

          but only for the old and the well-endowed.

    • (Score: 0) by Anonymous Coward on Friday November 10 2017, @06:28PM

      by Anonymous Coward on Friday November 10 2017, @06:28PM (#595257)

      It's not just that. There's also the issue of due diligence. If FB doesn't have a copy, the original could be anything at all. They also wouldn't have any way of training an AI to verify that the things being uploaded are likely to be illegal or harassment rather than just images that somebody is trolling with to get people locked out.

      I get the impetus for this, but the technical and legal challenges in doing this are probably a lot more complicated than it might seem.

  • (Score: 2) by frojack on Friday November 10 2017, @08:07PM (4 children)

    by frojack (1554) on Friday November 10 2017, @08:07PM (#595310) Journal

    why they need to upload the image at all, they could have the hash computed in the user's browser

    Doubt this would be effective.

    A simple dimension change, crop, color shift, or Photoshop edit would outsmart the hash every time.

    Go to images.google.com and drag and drop any image you happen to have on your computer or from any web site, and Google will try to match it. It's pretty hopeless, even if the image is a face or body and that exact image is readily available on the web.

    --
    No, you are mistaken. I've always had this sig.
    • (Score: 3, Interesting) by JNCF on Friday November 10 2017, @08:13PM

      by JNCF (4317) on Friday November 10 2017, @08:13PM (#595313) Journal

      This is an issue whether it's done on the client or the server, it's not relevant to the quote. An AC below suggests that they'll probably divide the image so subsets of it can be hashed. I could see this helping for corners (if some, but not all, edges were cropped), or for sections around something easily identifiable like a face, but it wouldn't help with resizing, color changing, compression, etc.
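      The tile-hashing idea can be sketched like this; it shows why an aligned crop still matches while a resize or recompression would not (the 2x2 tile size and grayscale 2D-list image are assumptions for illustration):

```python
import hashlib

def tile_hashes(img, tile=2):
    """Hash fixed-size tiles of a 2D grayscale image so a crop that
    keeps whole tiles still shares hashes with the original."""
    h, w = len(img), len(img[0])
    hashes = set()
    for y in range(0, h - tile + 1, tile):
        for x in range(0, w - tile + 1, tile):
            block = bytes(img[y + dy][x + dx]
                          for dy in range(tile) for dx in range(tile))
            hashes.add(hashlib.sha256(block).hexdigest())
    return hashes

original = [[y * 10 + x for x in range(8)] for y in range(8)]
cropped = [row[:6] for row in original[:6]]  # tile-aligned crop
assert tile_hashes(cropped) <= tile_hashes(original)
```

A one-pixel shift, resize, or recompression changes every tile, so this only helps against the crop case, as noted above.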

    • (Score: 0) by Anonymous Coward on Saturday November 11 2017, @04:14AM (2 children)

      by Anonymous Coward on Saturday November 11 2017, @04:14AM (#595489)

      > A simple dimension change, cropping, color shift, or photo-shop would outsmart the hash every time.

      Play around with TinEye.com -- it does a surprisingly good job of matching images at different resolutions and different crops, even parody images that start with the same base and have something (mustache?) photoshopped in. Often by looking at the posting dates for the images it returns, it is possible to find out the origin of a particular image.

      While I don't use FB, if I did it would be handy to be able to send them hashes (or whatever) of images that I didn't want to see uploaded by anyone else. I'd even be willing to download their executable and generate the hashes locally before uploading. Or maybe the hashing could be built into some simple utility like EzThumbs.exe ?

      • (Score: 2) by JNCF on Saturday November 11 2017, @04:21PM (1 child)

        by JNCF (4317) on Saturday November 11 2017, @04:21PM (#595624) Journal

        But TinEye has both images to compare at the same time, not one image and the hash of another image. I don't see how their method would apply here, unless Facebook actually does retain the uploaded image.

        • (Score: 2) by JNCF on Saturday November 11 2017, @05:00PM

          by JNCF (4317) on Saturday November 11 2017, @05:00PM (#595636) Journal

          On second thought, maybe Facebook could be looking for features of an image (relations between curves, etc.) that I don't fully understand, hashing those features as individual pieces, and comparing those hashes to the hashes of the features gleaned from the potential revenge porn. I doubt TinEye re-examines its whole database every time it searches for an image. You might be onto something.
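          That intuition is roughly how perceptual hashes work. A minimal "average hash" sets one bit per pixel depending on whether it is brighter than the image mean, so uniform brightness shifts leave the hash unchanged; real implementations (aHash/pHash) first downscale to something like 8x8 grayscale so resizing and compression mostly wash out, a step this toy example skips:

```python
def average_hash(gray):
    """One bit per pixel: set when brighter than the image mean."""
    flat = [p for row in gray for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(a, b):
    """Count differing bits; small distances mean 'probably same image'."""
    return sum(x != y for x, y in zip(a, b))

img = [[10, 200], [220, 15]]
brighter = [[30, 230], [250, 35]]  # global brightness shift
assert hamming(average_hash(img), average_hash(brighter)) == 0
```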

  • (Score: 3, Interesting) by takyon on Friday November 10 2017, @10:29PM

    by takyon (881) <{takyon} {at} {soylentnews.org}> on Friday November 10 2017, @10:29PM (#595381) Journal

    Here's what they could do without involving any user action. Use a nudity detection algorithm, which I'm sure they are already running:

    https://lizrush.gitbooks.io/algorithms-for-webdevs-ebook/content/chapters/nudity-detection.html [gitbooks.io]
    https://blog.algorithmia.com/improving-nudity-detection-nsfw-image-recognition/ [algorithmia.com]

    Then use facial recognition on the image. If the nude person(s) can be identified, prevent upload if they have not changed an opt-in setting to allow nudes of them to be posted.

    If a person uploads one of their own nudes, that is a good time for Facebook to throw a warning about the dangers of posting your tits online, or just confirming that you are uploading the correct image, and then have a button that says "Opt me in!"

    This could definitely cut down on some of the activity because Facebook has been using facial recognition to detect people in photos for years. I'm sure it has gotten more accurate since it debuted. Of course, once the system is in place, people can just upload the nude photos to another site. And at that point, it is no longer Facebook's problem.
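    That opt-in flow could be sketched as follows; detect_nudity and identify_faces are placeholders standing in for the classifiers and facial recognition Facebook would already be running:

```python
def detect_nudity(image) -> bool:
    return image.get("nude", False)  # placeholder classifier

def identify_faces(image) -> list:
    return image.get("faces", [])    # placeholder recognizer

def allow_upload(image, uploader, opted_in):
    """Block nudes unless every identified person has opted in.
    The uploader posting their own nude would instead trigger
    the confirmation/opt-in prompt described above."""
    if not detect_nudity(image):
        return True
    people = identify_faces(image)
    return all(p == uploader or p in opted_in for p in people)

opted_in = {"alice"}
assert allow_upload({"nude": False, "faces": ["bob"]}, "carol", opted_in)
assert allow_upload({"nude": True, "faces": ["alice"]}, "carol", opted_in)
assert not allow_upload({"nude": True, "faces": ["bob"]}, "carol", opted_in)
```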

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 2) by DannyB on Monday November 13 2017, @04:27PM

    by DannyB (5839) Subscriber Badge on Monday November 13 2017, @04:27PM (#596262) Journal

    why they need to upload the image at all, they could have the hash computed in the user's browser and just upload that.

    Let's use a hypothetical example.

    Let's suppose there is this person, Killary Flinton. Killary uploads lots of picture hashes to block these pictures from ever appearing on FaceTwit. Killary claims these are embarrassing pictures taken earlier in life when Killary wanted to prove having larger testicles than any other candidate. FaceTwit accepts Killary's explanation of why these pictures should never appear on FaceTwit.

    Now there is this other person Ronald Rump.

    It turns out that the picture hashes Killary uploaded are actually hashes of pictures of Ronald Rump that Killary wants to block. Killary's diabolical plan to rule the world is to get elected by blocking all possible imagery of Rump.

    Therefore humans at facebook need to see the naked pictures:
    1. in order to prevent abuse
    2. looking at hashes is not as gratifying as looking at pictures

    Hope that helps. :-)

    --
    Stupid people exist because nothing in the food chain eats them anymore.