SoylentNews is people

posted by cmn32480 on Friday November 10 2017, @04:49PM   Printer-friendly
from the hackers-paradise dept.

Facebook to Fight Revenge Porn by Letting Potential Victims Upload Nudes in Advance

Submitted via IRC for TheMightyBuzzard

This new protection system works similarly to the anti-child-porn detection systems already in use at Facebook and other social media giants such as Google, Twitter, and Instagram.

It works on a database of file hashes, a unique cryptographic fingerprint computed for each file.
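Facebook has not published implementation details, but the basic hash-blocklist idea can be sketched in a few lines of Python. This is purely illustrative; the byte strings and the SHA-256 choice here are assumptions, not Facebook's actual scheme:

```python
import hashlib

def file_hash(data: bytes) -> str:
    """SHA-256 digest of the file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of hashes computed from reported images.
blocked = {file_hash(b"reported nude photo bytes")}

def upload_allowed(data: bytes) -> bool:
    """Reject any upload whose hash matches a reported image."""
    return file_hash(data) not in blocked
```

Only the hashes need to be stored server-side; the blocklist never has to contain the images themselves.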

Facebook says that once an abuser tries to upload an image marked as "revenge porn" in its database, its system will block the upload. This will work for images shared on the main Facebook service, and also for images shared privately via Messenger, Facebook's IM app.

The weird thing is that in order to build a database of "revenge porn" file hashes, Facebook will rely on potential victims uploading a copy of the nude photo in advance.

This process involves the victim sending a copy of the nude photo to their own account via Facebook Messenger. This means uploading a copy of the nude photo to Facebook Messenger, the very same act the victim is trying to prevent.

The victim can then report the photo to Facebook, which will create a hash of the image that the social network will use to block further uploads of the same photo.

This is possible because in April 2017, Facebook modified its image reporting process to take into account images showing "revenge porn" acts.

Facebook says it's not storing a copy of the photo, but only computing the file's hash and adding it to its database of revenge porn imagery.

Victims who fear that former or current partners may upload a nude photo online can proactively take this step to block the image from ever being uploaded to Facebook and shared among friends.

We won't be doing this. I don't even want to see hashes of you folks naked.

Source: https://www.bleepingcomputer.com/news/technology/facebook-to-fight-revenge-porn-by-letting-potential-victims-upload-nudes-in-advance/

Facebook asks Australians to send nude photos, for safety

"Worried that an ex-boyfriend or girlfriend might post your intimate photos on the internet? Facebook says it has a solution – as long as you'll hand over the photos first.

The social media giant recently announced its new plan to combat "revenge porn," when individuals post nude photos online without the consent of the subject." http://www.foxnews.com/tech/2017/11/08/facebook-says-it-needs-your-explicit-photos-to-combat-revenge-porn.html


Original Submission #1 | Original Submission #2

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 2) by looorg on Friday November 10 2017, @06:05PM (9 children)

    by looorg (578) on Friday November 10 2017, @06:05PM (#595234)

    Facebook says it's not storing a copy of the photo, but only computing the file's hash and adding it to its database of revenge porn imagery.

    So they are hashing streams now? Totally not storing the image even temporarily and then deleting it right after they are done. 'Cause deleting content is what Facebook is known for ... If sharing all your filthy amateur p0rn with Facebook to protect you from revenge p0rn from all the creepy dudes (or ladies) you have been banging doesn't sound like a trap, I don't know what ever will.

    Victims who fear that former or current partners may upload a nude photo online can pro-actively take this step

    Perhaps they should just integrate that into the phones photo-app. "Hi! Clippy-the-Pimp has detected a lot of fleshtones in your picture and my AI sub-processor has deemed the content lewd. A copy has been automagically CC:ed to the Facebook cloud-anti-revenge-p0rn-hash-library to keep you safe!"

    Starting Score:    1  point
    Karma-Bonus Modifier   +1  

    Total Score:   2  
  • (Score: 2) by Runaway1956 on Friday November 10 2017, @06:22PM (6 children)

    by Runaway1956 (2926) Subscriber Badge on Friday November 10 2017, @06:22PM (#595250) Journal

    "only computing the file's hash"

    So, how much does a photo need to change, before it generates a different hash? A little cropping, for a closeup? Do a redeye filter on it? Change a few pixels in a corner? Hide an encrypted file in it? Maybe just strip the metadata?

    I wonder how much their little scheme relies on metadata, itself? "We've got a match here John!! Same time, date, and GPS coordinates as these six photos!!" (oversimplified, of course, but the algorithms may well be using that metadata)
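For a plain cryptographic hash, the answer to "how much does a photo need to change" is: any change at all. A quick Python sketch of this avalanche effect (the byte string is a stand-in for real image data):

```python
import hashlib

original = b"stand-in for a photo's raw bytes"
tweaked = bytearray(original)
tweaked[0] ^= 0x01  # flip a single bit

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(bytes(tweaked)).hexdigest()

# One flipped bit yields a completely different digest.
print(h1 == h2)  # False
```

This is why exact-hash matching alone would be trivially defeated by cropping, filtering, or stripping metadata, and why perceptual matching comes up in the replies below.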

    • (Score: 1, Interesting) by Anonymous Coward on Friday November 10 2017, @06:34PM (2 children)

      by Anonymous Coward on Friday November 10 2017, @06:34PM (#595262)

      Presumably, it's not just one hash, it's a half-dozen or more done with varying degrees of filtering. If you divide an image up into 8 pieces, it's hard to change the original so that it doesn't have the same relationship between the samples. The downside is that if it's only 8 there are going to be a ton of images that match. You can increase that number to something larger like 128 or more, in which case it'll be more susceptible to this kind of manipulation. You can also compare individual samples between the two.

      There are other things you can do, like taking a high-pass filter of the two images and comparing those checksums. It's fairly hard to keep those from coming out similar without massively altering the original.

      And ultimately, you don't necessarily have to make it a 100% match. Having false positives that somebody has to manually review is not necessarily a problem if they are relatively unusual and the image remains in place pending review.

      You're unlikely to ever completely solve the problem without manually comparing images, but just making it less convenient to post will likely cut down on the sharing that goes on.

      • (Score: 1) by khallow on Friday November 10 2017, @08:17PM (1 child)

        by khallow (3766) Subscriber Badge on Friday November 10 2017, @08:17PM (#595316) Journal
        The point of a hash is that close is not good enough. With the scheme described in this thread, it sounds like you can recreate the picture by playing a game of warmer/colder (the nudie will be the near unique result with nice gradients).
        • (Score: 0) by Anonymous Coward on Saturday November 11 2017, @06:34AM

          by Anonymous Coward on Saturday November 11 2017, @06:34AM (#595515)

          And having to have a new hash to cover every single pixel change on every image on FB would easily overwhelm the storage capacity of every storage device on the planet.

          Humans can identify an image as being the same despite minor changes; a traditional hash isn't going to handle that very well without something being done to normalize the data. Chunking and blurring remove a lot of minor variation without using the processing power of more in-depth analysis.

          The point is that rather than comparing an image against all of the images you can compare against just the ones sharing the abbreviated match. Then you can do something more resource intensive to verify or rule out the match.

          Just comparing checksums would be a waste of resources with an unforgivable number of false negatives.
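The chunk-and-average scheme discussed in this subthread can be sketched as a toy "average hash" in Python. This is an illustration of the general technique, not Facebook's actual algorithm; the image here is a synthetic grayscale gradient:

```python
def average_hash(pixels, hash_size=8):
    """Shrink the image to hash_size x hash_size by block-averaging,
    then set one bit per cell: 1 if the cell is brighter than the
    overall mean, else 0. Small edits barely move any bits."""
    h, w = len(pixels), len(pixels[0])
    bh, bw = h // hash_size, w // hash_size
    cells = []
    for r in range(hash_size):
        for c in range(hash_size):
            block = [pixels[y][x]
                     for y in range(r * bh, (r + 1) * bh)
                     for x in range(c * bw, (c + 1) * bw)]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    return [1 if v > mean else 0 for v in cells]

def hamming(a, b):
    """Number of differing bits; a small distance suggests the same image."""
    return sum(x != y for x, y in zip(a, b))

# A 64x64 grayscale gradient, and a near-duplicate with one pixel changed.
img = [[(x + y) % 256 for x in range(64)] for y in range(64)]
near_dup = [row[:] for row in img]
near_dup[0][0] = 255

dist = hamming(average_hash(img), average_hash(near_dup))
print(dist)  # small distance: the tiny edit survives the comparison
```

Unlike a cryptographic hash, the tweaked image still matches; the more resource-intensive verification step mentioned above would then confirm or rule out the candidate.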

    • (Score: 2) by looorg on Friday November 10 2017, @06:49PM

      by looorg (578) on Friday November 10 2017, @06:49PM (#595270)

      Since they don't really, from what I have seen so far, go into any great depth about how this revenge-p0rn protection scheme will work, it's quite hard to know. Will they even store metadata? Will there just be one hash per image, or will they run a few different ones? Considering the number of people on earth that apparently also like to take pictures of themselves showing tits, dicks and all things between, there should be a fairly large number of these created on a daily basis. It shouldn't take too much time until you run into one of them hashing paradoxes and there is a collision, and you have two or more images that generate the same hash (two files, one hash ...). That might not matter all that much in this case, though, since they would just be blocking a few more p0rn images than they intended. There might be an issue if they want to notify the original uploader that they have been shared and the system automatically messages two different people to tell them about it.

    • (Score: 4, Funny) by JNCF on Friday November 10 2017, @07:07PM

      by JNCF (4317) on Friday November 10 2017, @07:07PM (#595276) Journal

      So, how much does a photo need to change, before it generates a different hash?

      Just a bit.

    • (Score: 0) by Anonymous Coward on Saturday November 11 2017, @04:35AM

      by Anonymous Coward on Saturday November 11 2017, @04:35AM (#595496)

      See comments elsewhere about TinEye.com -- it can match images with all kinds of changes, including parodies with photoshopping. I'm assuming FB will do something like this, maybe just buy the TinEye technology, lord knows they could afford it.

  • (Score: 0) by Anonymous Coward on Friday November 10 2017, @06:29PM (1 child)

    by Anonymous Coward on Friday November 10 2017, @06:29PM (#595259)

    Facebook says it's not storing a copy of the photo, but only computing the file's hash...

    It depends entirely on the article you read. The first one I read said this. A later one confirmed that they retain the photos, but blur them, and share them with select people. A later one said that they won't blur them at all and staff will definitely be looking at everything.

    Bottom line, Facebook wants to become a hub of amateur pr0n the same way they've become the envy of the NSA for personal information.

    • (Score: 2) by HiThere on Friday November 10 2017, @07:08PM

      by HiThere (866) Subscriber Badge on Friday November 10 2017, @07:08PM (#595279) Journal

      Maybe they're planning on using them as input to their new AI image generator...everything guaranteed artificial, but lifelike.

      --
      Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.