posted by cmn32480 on Friday November 10 2017, @04:49PM
from the hackers-paradise dept.

Facebook to Fight Revenge Porn by Letting Potential Victims Upload Nudes in Advance

Submitted via IRC for TheMightyBuzzard

This new protection system works similarly to the anti-child-porn detection systems already in use at Facebook and other social media giants such as Google, Twitter, and Instagram.

It relies on a database of file hashes, cryptographic signatures computed for each file.

Facebook says that once an abuser tries to upload an image marked as "revenge porn" in its database, its system will block the upload. This will work not only for images shared on the main Facebook service, but also for images shared privately via Messenger, Facebook's IM app.

Potential victims will need to upload nude photos of themselves

The weird thing is that in order to build a database of "revenge porn" file hashes, Facebook will rely on potential victims uploading a copy of the nude photo in advance.

This process involves the victim sending a copy of the nude photo to their own account via Facebook Messenger. This means uploading a copy of the nude photo to Facebook Messenger, the very act the victim is trying to prevent.

The victim can then report the photo to Facebook, which will create a hash of the image that the social network will use to block further uploads of the same photo.

This is possible because in April this year, Facebook modified its image reporting process to take into account images showing "revenge porn" acts.

Facebook says it's not storing a copy of the photo, but only computing the file's hash and adding it to its database of revenge porn imagery.

Victims who fear that former or current partners may upload a nude photo online can proactively take this step to block the image from ever being uploaded to Facebook and shared among friends.
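
In outline, the scheme the article describes is just a hash blocklist: hash the reported image, store only the digest, and refuse any later upload whose digest matches. Here is a minimal sketch of that idea in Python, assuming a plain SHA-256 digest and an in-memory set; Facebook has not published its actual implementation:

    import hashlib

    # Hypothetical in-memory blocklist; a real system would use a persistent database.
    blocked_hashes = set()

    def file_hash(path):
        """Compute a SHA-256 digest of the file's raw bytes."""
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    def report_image(path):
        """Victim reports an image: only the hash is stored, not the photo itself."""
        blocked_hashes.add(file_hash(path))

    def allow_upload(path):
        """Refuse any upload whose hash matches a reported image."""
        return file_hash(path) not in blocked_hashes

Note that a cryptographic digest like this matches only byte-for-byte identical files; changing a single pixel yields a completely different hash, a limitation several commenters pick up on below.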

We won't be doing this. I don't even want to see hashes of you folks naked.

Source: https://www.bleepingcomputer.com/news/technology/facebook-to-fight-revenge-porn-by-letting-potential-victims-upload-nudes-in-advance/

Facebook asks Australians to send nude photos, for safety

"Worried that an ex-boyfriend or girlfriend might post your intimate photos on the internet? Facebook says it has a solution – as long as you'll hand over the photos first.

The social media giant recently announced its new plan to combat "revenge porn," when individuals post nude photos online without the consent of the subject."

Source: http://www.foxnews.com/tech/2017/11/08/facebook-says-it-needs-your-explicit-photos-to-combat-revenge-porn.html


Original Submission #1 | Original Submission #2

 
This discussion has been archived. No new comments can be posted.
  • (Score: 5, Insightful) by takyon on Friday November 10 2017, @05:05PM (20 children)

    by takyon (881) <takyonNO@SPAMsoylentnews.org> on Friday November 10 2017, @05:05PM (#595195) Journal

    Look at my links below.

    Will they restrict uploads to uploaders 18 and older? What about all the teen accounts that set their age as 90?

    Facebook has an existing "facial" recognition algorithm. It seems like they should be able to alter that to block nude photos automatically (the ones that are personally identifiable anyway).

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 2) by DannyB on Friday November 10 2017, @05:15PM (3 children)

    by DannyB (5839) Subscriber Badge on Friday November 10 2017, @05:15PM (#595200) Journal

    If Facebook doesn't trust their algorithms to recognize revenge pr0n, then they probably don't trust them to identify that this person doesn't look 90 years old.

    As for automatically recognizing nude humans, Facebook will end up with quite a training data set if people actually do submit plenty of photos.

    At some point, it would become unnecessary to have humans vetting the photos to ensure they are actually potential revenge pr0n or blackmail material.

    --
    To transfer files: right-click on file, pick Copy. Unplug mouse, plug mouse into other computer. Right-click, paste.
    • (Score: 2) by takyon on Friday November 10 2017, @05:17PM

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Friday November 10 2017, @05:17PM (#595201) Journal

      At some point, it would become unnecessary to have humans vetting the photos to ensure they are actually potential revenge pr0n or blackmail material.

      Unnecessary, yet highly stimulating.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
    • (Score: 0) by Anonymous Coward on Friday November 10 2017, @06:24PM

      by Anonymous Coward on Friday November 10 2017, @06:24PM (#595251)

      The problem is that they still have to retain a copy of the materials in case there are court proceedings, but the stuff involving minors is illegal to possess. So they wouldn't be able to keep the images in order to verify that the images are illegal, and as a result there's a huge hole that could be used to shut down just about anybody's account by processing images through this algorithm.

      And since things that get flagged as child porn tend to get automatically blocked until proven not to be, it could easily wind up being abused if FB hasn't thought the process through.

    • (Score: 3, Insightful) by frojack on Friday November 10 2017, @07:08PM

      by frojack (1554) on Friday November 10 2017, @07:08PM (#595280) Journal

      Facebook will end up with quite a training data set if people actually do submit plenty of photos.

      With every single Facebook user busily adding photos of friends and family and casual acquaintances, complete with names and email addresses, they don't need nudes to help them.

      They've already got it figured out, and so have Apple and Google. People just can't seem to help themselves from building Skynet and joining the Borg. Facebook conspired with Intel to build Intel's new Nervana chipset [intelnervana.com] for this kind of thing. Instant facial recognition and doxing, anywhere, anytime.

      People just can't seem to step away from their own camera, so I suspect Facebook will find a lot of takers for this offer.

      --
      No, you are mistaken. I've always had this sig.
  • (Score: 3, Touché) by DannyB on Friday November 10 2017, @05:18PM

    by DannyB (5839) Subscriber Badge on Friday November 10 2017, @05:18PM (#595202) Journal

    What about all the teen accounts that set their age as 90?

    The teens will be arrested and prosecuted for distribution of child pr0n.

    Facebook, the poor victim, will be asked to provide copies in triplicate, to police, as evidence.

    --
    To transfer files: right-click on file, pick Copy. Unplug mouse, plug mouse into other computer. Right-click, paste.
  • (Score: 3, Interesting) by JNCF on Friday November 10 2017, @05:23PM (12 children)

    by JNCF (4317) on Friday November 10 2017, @05:23PM (#595208) Journal

    I'm confused as to why they need to upload the image at all; they could have the hash computed in the user's browser and just upload that. With this scheme, there shouldn't be any reason to restrict minors from partaking. Maybe the images need to be reviewed by a human before hashing to ensure that the system isn't flooded with hashes of banal data that would stop, say, political memes from spreading? That filtering could also happen after a match is detected, but it would then at least delay the banal data from being posted, encouraging users to keep poisoning their well with phony selfies.
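
    Computing the hash locally is indeed trivial. A minimal sketch, in which the report endpoint is purely hypothetical and only the digest ever leaves the device:

        import hashlib

        import requests  # third-party library; the endpoint below is hypothetical

        def report_hash_only(path, endpoint="https://example.invalid/report-hash"):
            """Hash the photo locally and submit only the digest; the image stays on the device."""
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            requests.post(endpoint, json={"sha256": digest})

    The catch, as the reply below points out, is that the server then has no way to verify what was actually hashed.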

    • (Score: 2) by The Mighty Buzzard on Friday November 10 2017, @05:34PM (4 children)

      by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Friday November 10 2017, @05:34PM (#595215) Homepage Journal

      Yep. That's the downside of client-side computing. You can't trust the data processing. Done server-side, you could just let anything with a fleshtones:non-fleshtones ratio above N through to the hashing function and feel relatively certain it's worth blocking that hash.
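
      A crude version of that server-side gate might look like the sketch below, assuming Pillow is available; the RGB rule is a common naive skin-tone heuristic and the threshold is arbitrary:

          from PIL import Image

          def fleshtone_ratio(path):
              """Rough fraction of pixels falling inside a naive RGB 'skin tone' range."""
              img = Image.open(path).convert("RGB")
              pixels = list(img.getdata())
              skin = sum(1 for r, g, b in pixels
                         if r > 95 and g > 40 and b > 20 and r > g and r > b)
              return skin / len(pixels)

          def worth_hashing(path, threshold=0.3):
              """Only pass images that look fleshy enough on to the hashing step."""
              return fleshtone_ratio(path) > threshold

      As the replies note, a monochrome image would sail right past a gate like this.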

      --
      My rights don't end where your fear begins.
      • (Score: 2) by JNCF on Friday November 10 2017, @05:40PM (2 children)

        by JNCF (4317) on Friday November 10 2017, @05:40PM (#595218) Journal

        Monochrome revenge porn?

        • (Score: 2) by The Mighty Buzzard on Friday November 10 2017, @06:09PM (1 child)

          by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Friday November 10 2017, @06:09PM (#595239) Homepage Journal

          Yeah. It'd be much more difficult than text filters for certain. Low hanging fruit would be possible for a bit at least though.

          --
          My rights don't end where your fear begins.
          • (Score: 0) by Anonymous Coward on Friday November 10 2017, @10:24PM

            by Anonymous Coward on Friday November 10 2017, @10:24PM (#595378)

            but only for the old and the well-endowed.

      • (Score: 0) by Anonymous Coward on Friday November 10 2017, @06:28PM

        by Anonymous Coward on Friday November 10 2017, @06:28PM (#595257)

        It's not just that. There's also the issue of due diligence. If FB doesn't have a copy, the original could be anything at all. They also wouldn't have any way of training an AI to verify that the things being uploaded are likely to be illegal or harassment rather than just images that somebody is trolling with to get people locked out.

        I get the impetus for this, but the technical and legal challenges in doing this are probably a lot more complicated than it might seem.

    • (Score: 2) by frojack on Friday November 10 2017, @08:07PM (4 children)

      by frojack (1554) on Friday November 10 2017, @08:07PM (#595310) Journal

      why they need to upload the image at all, they could have the hash computed in the user's browser

      Doubt this would be effective.

      A simple dimension change, crop, color shift, or Photoshop edit would outsmart the hash every time.

      Go to images.google.com and drag and drop any image you happen to have on your computer or from any web site, and Google will try to match it. It's pretty hopeless, especially if the image is a face or body, even if that exact image is readily available on the web.

      --
      No, you are mistaken. I've always had this sig.
      • (Score: 3, Interesting) by JNCF on Friday November 10 2017, @08:13PM

        by JNCF (4317) on Friday November 10 2017, @08:13PM (#595313) Journal

        This is an issue whether it's done on the client or the server; it's not relevant to the quote. An AC below suggests that they'll probably divide the image so subsets of it can be hashed. I could see this helping for corners (if some, but not all, edges were cropped), or for sections around something easily identifiable like a face, but it wouldn't help with resizing, color changes, compression, etc.

      • (Score: 0) by Anonymous Coward on Saturday November 11 2017, @04:14AM (2 children)

        by Anonymous Coward on Saturday November 11 2017, @04:14AM (#595489)

        > A simple dimension change, cropping, color shift, or photo-shop would outsmart the hash every time.

        Play around with TinEye.com -- it does a surprisingly good job of matching images at different resolutions and different crops, even parody images that start with the same base and have something (mustache?) photoshopped in. Often by looking at the posting dates for the images it returns, it is possible to find out the origin of a particular image.

        While I don't use FB, if I did it would be handy to be able to send them hashes (or whatever) of images that I didn't want to see uploaded by anyone else. I'd even be willing to download their executable and generate the hashes locally before uploading. Or maybe the hashing could be built into some simple utility like EzThumbs.exe ?

        • (Score: 2) by JNCF on Saturday November 11 2017, @04:21PM (1 child)

          by JNCF (4317) on Saturday November 11 2017, @04:21PM (#595624) Journal

          But TinEye has both images to compare at the same time, not one image and the hash of another image. I don't see how their method would apply here, unless facebook actually does retain the uploaded image.

          • (Score: 2) by JNCF on Saturday November 11 2017, @05:00PM

            by JNCF (4317) on Saturday November 11 2017, @05:00PM (#595636) Journal

            On second thought, maybe facebook could be looking for features of an image (relations between curves, etc.) that I don't fully understand, hashing that mess as individual pieces, and comparing those hashes to the hashes of the features gleaned from the potential revenge porn. I doubt TinEye reexamines its whole database every time it searches for an image. You might be onto something.
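
            That intuition is roughly how perceptual hashing works: derive a small fingerprint from the image's structure and compare fingerprints by Hamming distance, so near-duplicates land near each other. A minimal difference-hash (dHash) sketch, assuming Pillow; this illustrates the general technique, not whatever Facebook or TinEye actually run:

                from PIL import Image

                def dhash(path, size=8):
                    """Difference hash: shrink to (size+1) x size grayscale, then record
                    whether each pixel is brighter than its right-hand neighbour."""
                    img = Image.open(path).convert("L").resize((size + 1, size))
                    pixels = list(img.getdata())
                    bits = 0
                    for row in range(size):
                        for col in range(size):
                            left = pixels[row * (size + 1) + col]
                            right = pixels[row * (size + 1) + col + 1]
                            bits = (bits << 1) | (1 if left > right else 0)
                    return bits  # 64-bit fingerprint for size=8

                def hamming(a, b):
                    """Count of differing bits; a small distance suggests the same underlying image."""
                    return bin(a ^ b).count("1")

            Resizing, recompression, and mild color shifts barely move a fingerprint like this, though aggressive crops can still defeat it.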

    • (Score: 3, Interesting) by takyon on Friday November 10 2017, @10:29PM

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Friday November 10 2017, @10:29PM (#595381) Journal

      Here's what they could do without involving any user action. Use a nudity detection algorithm, which I'm sure they are already running:

      https://lizrush.gitbooks.io/algorithms-for-webdevs-ebook/content/chapters/nudity-detection.html [gitbooks.io]
      https://blog.algorithmia.com/improving-nudity-detection-nsfw-image-recognition/ [algorithmia.com]

      Then use facial recognition on the image. If the nude person(s) can be identified, prevent upload if they have not changed an opt-in setting to allow nudes of them to be posted.

      If a person uploads one of their own nudes, that is a good time for Facebook to throw a warning about the dangers of posting your tits online, or just confirm that you are uploading the correct image, and then have a button that says "Opt me in!"

      This could definitely cut down on some of the activity because Facebook has been using facial recognition to detect people in photos for years. I'm sure it has gotten more accurate since it debuted. Of course, once the system is in place, people can just upload the nude photos to another site. And at that point, it is no longer Facebook's problem.
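
      Wired together, that gate is only a few lines. In the sketch below, detect_nudity, identify_people, and has_opted_in are hypothetical placeholders for whatever classifier, facial-recognition service, and account setting Facebook actually runs:

          def allow_photo_upload(image, detect_nudity, identify_people, has_opted_in):
              """The three callables are placeholders for systems that are not public."""
              if not detect_nudity(image):
                  return True                        # not nude, nothing to gate
              for person in identify_people(image):  # recognizable people in the photo
                  if not has_opted_in(person):
                      return False                   # block unless everyone shown has opted in
              return True                            # nude, but unidentifiable or fully consented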

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
    • (Score: 2) by DannyB on Monday November 13 2017, @04:27PM

      by DannyB (5839) Subscriber Badge on Monday November 13 2017, @04:27PM (#596262) Journal

      why they need to upload the image at all, they could have the hash computed in the user's browser and just upload that.

      Let's use a hypothetical example.

      Let's suppose there is this person, Killary Flinton. Killary uploads lots of picture hashes to block these pictures from ever appearing on FaceTwit. Killary claims these are embarrassing pictures taken earlier in life when Killary wanted to prove having larger testicles than any other candidate. FaceTwit accepts Killary's explanation of why these pictures should never appear on FaceTwit.

      Now there is this other person Ronald Rump.

      It turns out that the picture hashes Killary uploaded are actually hashes of pictures of Ronald Rump that Killary wants to block. Killary's diabolical plan to rule the world is that Killary can get elected by blocking all possible imagery of Rump.

      Therefore humans at facebook need to see the naked pictures:
      1. in order to prevent abuse
      2. looking at hashes is not as gratifying as looking at pictures

      Hope that helps. :-)

      --
      To transfer files: right-click on file, pick Copy. Unplug mouse, plug mouse into other computer. Right-click, paste.
  • (Score: 2) by JNCF on Friday November 10 2017, @05:25PM

    by JNCF (4317) on Friday November 10 2017, @05:25PM (#595210) Journal

    Reading your link below, yes, they review beforehand. I guess no children, then.

  • (Score: 1, Insightful) by Anonymous Coward on Friday November 10 2017, @05:55PM

    by Anonymous Coward on Friday November 10 2017, @05:55PM (#595228)

    They can use this technology to censor any person a government or corporation doesn't want to exist.

    Any photo you upload to try and bring attention to the person is just poof gone.

    A few years and they will be able to simply insert an alternative person or generic stand-in as needed to replace someone.

    Technological censorship and revisionism is about to be taken to a whole other level.