posted by cmn32480 on Friday November 10 2017, @04:49PM   Printer-friendly
from the hackers-paradise dept.

Facebook to Fight Revenge Porn by Letting Potential Victims Upload Nudes in Advance

Submitted via IRC for TheMightyBuzzard

This new protection system works similarly to the anti-child-porn detection systems in use at Facebook and other social media giants like Google, Twitter, and Instagram.

It works from a database of file hashes: cryptographic signatures computed for each file.

Facebook says that once an abuser tries to upload an image marked as "revenge porn" in its database, its system will block the upload. This will work for images shared on the main Facebook service, but also for images shared privately via Messenger, Facebook's IM app.

Potential victims will need to upload nude photos of themselves.

The weird thing is that in order to build a database of "revenge porn" file hashes, Facebook will rely on potential victims uploading a copy of the nude photo in advance.

This process involves the victim sending a copy of the nude photo to their own account via Facebook Messenger. This means uploading a copy of the nude photo to Facebook Messenger, the very same act the victim is trying to prevent.

The victim can then report the photo to Facebook, which will create a hash of the image that the social network will use to block further uploads of the same photo.

This is possible because in April this year, Facebook modified its image reporting process to take into account images showing "revenge porn" acts.

Facebook says it's not storing a copy of the photo, but only computing the file's hash and adding it to its database of revenge porn imagery.

Victims who fear that former or current partners may upload a nude photo online can proactively take this step to block the image from ever being uploaded to Facebook and shared among friends.
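
To make the described flow concrete, here is a minimal sketch in Python, assuming a plain cryptographic digest for illustration; Facebook has not published its actual matching algorithm, which would more likely be a perceptual hash than an exact one:

    import hashlib

    # Hypothetical store of hashes of reported "revenge porn" images.
    blocked_hashes = set()

    def file_hash(data: bytes) -> str:
        # The "cryptographic signature computed for each file".
        return hashlib.sha256(data).hexdigest()

    def report_image(data: bytes) -> None:
        # The victim reports the photo; only its hash is retained.
        blocked_hashes.add(file_hash(data))

    def allow_upload(data: bytes) -> bool:
        # Any later upload whose hash matches a reported image is blocked.
        return file_hash(data) not in blocked_hashes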

We won't be doing this. I don't even want to see hashes of you folks naked.

Source: https://www.bleepingcomputer.com/news/technology/facebook-to-fight-revenge-porn-by-letting-potential-victims-upload-nudes-in-advance/

Facebook asks Australians to send nude photos, for safety

"Worried that an ex-boyfriend or girlfriend might post your intimate photos on the internet? Facebook says it has a solution – as long as you'll hand over the photos first.

The social media giant recently announced its new plan to combat "revenge porn," when individuals post nude photos online without the consent of the subject."

http://www.foxnews.com/tech/2017/11/08/facebook-says-it-needs-your-explicit-photos-to-combat-revenge-porn.html


Original Submission #1 | Original Submission #2

 
  • (Score: 4, Funny) by DannyB on Friday November 10 2017, @04:54PM (27 children)

    by DannyB (5839) Subscriber Badge on Friday November 10 2017, @04:54PM (#595184) Journal

    I hope Facebook will treat this sensitive data with the care it deserves and ensure there are plenty of backups to prevent data loss.

    Who would volunteer to provide offsite storage of this data for Facebook?

    --
    When trying to solve a problem don't ask who suffers from the problem, ask who profits from the problem.
    • (Score: 2, Insightful) by Ethanol-fueled on Friday November 10 2017, @05:01PM (5 children)

      by Ethanol-fueled (2792) on Friday November 10 2017, @05:01PM (#595191) Homepage

      CIA blackmail material.

      • (Score: 1) by tftp on Friday November 10 2017, @09:08PM (4 children)

        by tftp (806) on Friday November 10 2017, @09:08PM (#595351) Homepage

        Why only the CIA? FB has employees from all over the world, none of whom have government clearance. FB is not subject to DSS audits and can run the machinery any way they please. Given the wealth of data they have obtained, FB is obviously an important target for the intelligence services of all major countries. Blackmail is one of the most commonly used tools for recruiting agents.
        • (Score: 2, Interesting) by Ethanol-fueled on Friday November 10 2017, @09:39PM

          by Ethanol-fueled (2792) on Friday November 10 2017, @09:39PM (#595355) Homepage

          " FB has employees from all over the world, none have goverment clearance. "

          First of all, Facebook happily complies with the American Government's requests anyway.

          Besides that point, unless you have inside info, I have a very difficult time believing your latter point in the quote above, whether or not they are "our guys." If I recall correctly, DSS audits apply only to facilities with closed areas. It is possible that Facebook has employees with clearance but no closed areas. Please correct me if I'm wrong about my latter point.

        • (Score: 2) by realDonaldTrump on Saturday November 11 2017, @12:01AM (2 children)

          by realDonaldTrump (6614) on Saturday November 11 2017, @12:01AM (#595411) Homepage Journal

          Very simply, Facebook was created by and for the CIA.

          • (Score: 2, Touché) by Ethanol-fueled on Saturday November 11 2017, @01:47AM

            by Ethanol-fueled (2792) on Saturday November 11 2017, @01:47AM (#595437) Homepage

            That doesn't sound like you, sir. Have they gotten to you too, treating you with scopolamine and Haitian voodoo drugs?

            Snap out of it! You've got a world to save!

          • (Score: 2) by DannyB on Monday November 13 2017, @04:13PM

            by DannyB (5839) Subscriber Badge on Monday November 13 2017, @04:13PM (#596247) Journal

            Very simply, Facebook was created by and for the CIA.

            Unlike Twitter, which was created by and for the good ol' KGB / FSB.

            --
            When trying to solve a problem don't ask who suffers from the problem, ask who profits from the problem.
    • (Score: 5, Insightful) by takyon on Friday November 10 2017, @05:05PM (20 children)

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Friday November 10 2017, @05:05PM (#595195) Journal

      Look at my links below.

      Will they restrict uploads to uploaders 18 and older? What about all the teen accounts that set their age as 90?

      Facebook has an existing "facial" recognition algorithm. It seems like they should be able to alter that to block nude photos automatically (the ones that are personally identifiable anyway).

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 2) by DannyB on Friday November 10 2017, @05:15PM (3 children)

        by DannyB (5839) Subscriber Badge on Friday November 10 2017, @05:15PM (#595200) Journal

        If Facebook doesn't trust their algorithms to recognize revenge pr0n, then they probably don't trust them to identify that this person doesn't look 90 years old.

        As for automatically recognizing nude humans, Facebook will end up with quite a training data set if people actually do submit plenty of photos.

        At some point, it would become unnecessary to have humans vetting the photos to ensure they are actually potential revenge pr0n or blackmail material.

        --
        When trying to solve a problem don't ask who suffers from the problem, ask who profits from the problem.
        • (Score: 2) by takyon on Friday November 10 2017, @05:17PM

          by takyon (881) <takyonNO@SPAMsoylentnews.org> on Friday November 10 2017, @05:17PM (#595201) Journal

          At some point, it would become unnecessary to have humans vetting the photos to ensure they are actually potential revenge pr0n or blackmail material.

          Unnecessary, yet highly stimulating.

          --
          [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
        • (Score: 0) by Anonymous Coward on Friday November 10 2017, @06:24PM

          by Anonymous Coward on Friday November 10 2017, @06:24PM (#595251)

          The problem is that they still have to retain a copy of the materials in case there are court proceedings, and the stuff involving minors is illegal to possess. So they wouldn't be able to keep the images needed to verify that the images are illegal, and as a result there's a huge hole that could be used to shut down just about anybody's account by processing images through this algorithm.

          And since things that get flagged as child porn tend to get automatically blocked until proven not to be, it could easily wind up being abused if FB hasn't thought the process through.

        • (Score: 3, Insightful) by frojack on Friday November 10 2017, @07:08PM

          by frojack (1554) on Friday November 10 2017, @07:08PM (#595280) Journal

          Facebook will end up with quite a training data set if people actually do submit plenty of photos.

          With every single Facebook user busily adding photos of friends and family and casual acquaintances, complete with names and email addresses, they don't need nudes to help them.

          They've already got it figured out, and so do Apple and Google. People just can't seem to help themselves from building skynet and joining the borg. Facebook conspired with Intel to build Intel's new Nervana chipset [intelnervana.com] for this kind of thing. Instant facial recognition and doxing, anywhere, anytime.

          People just can't seem to step away from their own camera, so I suspect Facebook will find a lot of takers for this offer.

          --
          No, you are mistaken. I've always had this sig.
      • (Score: 3, Touché) by DannyB on Friday November 10 2017, @05:18PM

        by DannyB (5839) Subscriber Badge on Friday November 10 2017, @05:18PM (#595202) Journal

        What about all the teen accounts that set their age as 90?

        The teens will be arrested and prosecuted for distribution of child pr0n.

        Facebook, the poor victim, will be asked to provide copies in triplicate, to police, as evidence.

        --
        When trying to solve a problem don't ask who suffers from the problem, ask who profits from the problem.
      • (Score: 3, Interesting) by JNCF on Friday November 10 2017, @05:23PM (12 children)

        by JNCF (4317) on Friday November 10 2017, @05:23PM (#595208) Journal

        I'm confused as to why they need to upload the image at all; they could have the hash computed in the user's browser and just upload that. With this scheme, there shouldn't be any reason to restrict minors from partaking. Maybe the images need to be reviewed by a human before hashing to ensure that the system isn't flooded with hashes of banal data that would stop, say, political memes from spreading? That filtering could also happen after a match is detected, but it would then at least delay the banal data from being posted, encouraging users to keep poisoning their well with phony selfies.

        • (Score: 2) by The Mighty Buzzard on Friday November 10 2017, @05:34PM (4 children)

          by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Friday November 10 2017, @05:34PM (#595215) Homepage Journal

          Yep. That's the downside of client-side computing. You can't trust the data processing. Done server-side, you could just let anything with a fleshtones:non-fleshtones ratio above N through to the hashing function and feel relatively certain it's worth blocking that hash.
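
          A naive sketch of that fleshtone test, assuming Pillow is available; the RGB thresholds below are a hypothetical skin heuristic for illustration, not anything Facebook is known to run:

              from PIL import Image

              def fleshtone_ratio(path: str) -> float:
                  img = Image.open(path).convert("RGB")
                  pixels = list(img.getdata())

                  def is_skin(r, g, b):
                      # Crude RGB skin-tone test (hypothetical thresholds).
                      return (r > 95 and g > 40 and b > 20
                              and r > g and r > b and abs(r - g) > 15)

                  skin = sum(1 for r, g, b in pixels if is_skin(r, g, b))
                  return skin / len(pixels)

              # Pass to the hashing function only if the ratio exceeds some N:
              # if fleshtone_ratio("upload.jpg") > 0.4: ...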

          --
          My rights don't end where your fear begins.
          • (Score: 2) by JNCF on Friday November 10 2017, @05:40PM (2 children)

            by JNCF (4317) on Friday November 10 2017, @05:40PM (#595218) Journal

            Monochrome revenge porn?

            • (Score: 2) by The Mighty Buzzard on Friday November 10 2017, @06:09PM (1 child)

              by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Friday November 10 2017, @06:09PM (#595239) Homepage Journal

              Yeah. It'd be much more difficult than text filters for certain. Low hanging fruit would be possible for a bit at least though.

              --
              My rights don't end where your fear begins.
              • (Score: 0) by Anonymous Coward on Friday November 10 2017, @10:24PM

                by Anonymous Coward on Friday November 10 2017, @10:24PM (#595378)

                but only for the old and the well-endowed.

          • (Score: 0) by Anonymous Coward on Friday November 10 2017, @06:28PM

            by Anonymous Coward on Friday November 10 2017, @06:28PM (#595257)

            It's not just that. There's also the issue of due diligence. If FB doesn't have a copy, the original could be anything at all. They also wouldn't have any way of training an AI to verify that the things being uploaded are likely to be illegal or harassment rather than just images that somebody is trolling with to get people locked out.

            I get the impetus for this, but the technical and legal challenges in doing this are probably a lot more complicated than it might seem.

        • (Score: 2) by frojack on Friday November 10 2017, @08:07PM (4 children)

          by frojack (1554) on Friday November 10 2017, @08:07PM (#595310) Journal

          why they need to upload the image at all, they could have the hash computed in the user's browser

          Doubt this would be effective.

          A simple dimension change, cropping, color shift, or Photoshop edit would outsmart the hash every time.

          Go to images.google.com and drag and drop any image you happen to have on your computer or from any web site, and Google will try to match it. It's pretty hopeless if the image is a face or body, even when that exact image is readily available on the web.
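
          The fragility is easy to demonstrate with an exact hash. A minimal sketch assuming Pillow and a hypothetical input file, where nudging one channel of one pixel by one step yields a completely different digest:

              import hashlib
              from PIL import Image

              img = Image.open("photo.jpg").convert("RGB")  # hypothetical input
              before = hashlib.sha256(img.tobytes()).hexdigest()

              # Change one channel of one pixel by a single step.
              r, g, b = img.getpixel((0, 0))
              img.putpixel((0, 0), (r, g, (b + 1) % 256))
              after = hashlib.sha256(img.tobytes()).hexdigest()

              print(before == after)  # False: the digests share nothing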

          --
          No, you are mistaken. I've always had this sig.
          • (Score: 3, Interesting) by JNCF on Friday November 10 2017, @08:13PM

            by JNCF (4317) on Friday November 10 2017, @08:13PM (#595313) Journal

            This is an issue whether it's done on the client or the server; it's not relevant to the quote. An AC below suggests that they'll probably divide the image so subsets of it can be hashed. I could see this helping for corners (if some, but not all, edges were cropped), or for sections around something easily identifiable like a face, but it wouldn't help with resizing, color changing, compression, etc.

          • (Score: 0) by Anonymous Coward on Saturday November 11 2017, @04:14AM (2 children)

            by Anonymous Coward on Saturday November 11 2017, @04:14AM (#595489)

            > A simple dimension change, cropping, color shift, or photo-shop would outsmart the hash every time.

            Play around with TinEye.com -- it does a surprisingly good job of matching images at different resolutions and different crops, even parody images that start with the same base and have something (mustache?) photoshopped in. Often by looking at the posting dates for the images it returns, it is possible to find out the origin of a particular image.

            While I don't use FB, if I did it would be handy to be able to send them hashes (or whatever) of images that I didn't want to see uploaded by anyone else. I'd even be willing to download their executable and generate the hashes locally before uploading. Or maybe the hashing could be built into some simple utility like EzThumbs.exe?

            • (Score: 2) by JNCF on Saturday November 11 2017, @04:21PM (1 child)

              by JNCF (4317) on Saturday November 11 2017, @04:21PM (#595624) Journal

              But TinEye has both images to compare at the same time, not one image and the hash of another image. I don't see how their method would apply here, unless facebook actually does retain the uploaded image.

              • (Score: 2) by JNCF on Saturday November 11 2017, @05:00PM

                by JNCF (4317) on Saturday November 11 2017, @05:00PM (#595636) Journal

                On second thought, maybe facebook could be looking for features of an image (relations between curves, etc.) that I don't fully understand, hashing that mess as individual pieces, and comparing those hashes to the hashes of the features gleaned from the potential revenge porn. I doubt TinEye reexamines its whole database every time it searches for an image. You might be onto something.

        • (Score: 3, Interesting) by takyon on Friday November 10 2017, @10:29PM

          by takyon (881) <takyonNO@SPAMsoylentnews.org> on Friday November 10 2017, @10:29PM (#595381) Journal

          Here's what they could do without involving any user action. Use a nudity detection algorithm, which I'm sure they are already running:

          https://lizrush.gitbooks.io/algorithms-for-webdevs-ebook/content/chapters/nudity-detection.html [gitbooks.io]
          https://blog.algorithmia.com/improving-nudity-detection-nsfw-image-recognition/ [algorithmia.com]

          Then use facial recognition on the image. If the nude person(s) can be identified, prevent upload if they have not changed an opt-in setting to allow nudes of them to be posted.

          If a person uploads one of their own nudes, that is a good time for Facebook to throw a warning about the dangers of posting your tits online, or just to confirm that you are uploading the correct image, and then show a button that says "Opt me in!"

          This could definitely cut down on some of the activity because Facebook has been using facial recognition to detect people in photos for years. I'm sure it has gotten more accurate since it debuted. Of course, once the system is in place, people can just upload the nude photos to another site. And at that point, it is no longer Facebook's problem.
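
          A sketch of that flow, where detect_nudity, identify_person, and has_opted_in are hypothetical stand-ins for whatever classifier, facial-recognition model, and settings store Facebook actually runs:

              def should_block(image, uploader, detect_nudity, identify_person,
                               has_opted_in):
                  # All callables are hypothetical stand-ins, not real APIs.
                  if detect_nudity(image) < 0.9:   # classifier score in 0..1
                      return False                 # not a nude; nothing to check
                  person = identify_person(image)  # facial recognition
                  if person is None:
                      return False                 # not personally identifiable
                  if person == uploader:
                      # Own nude: warn, confirm, offer the "Opt me in!" button,
                      # then allow the upload.
                      return False
                  # Someone else's identifiable nude: block unless they opted in.
                  return not has_opted_in(person)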

          --
          [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
        • (Score: 2) by DannyB on Monday November 13 2017, @04:27PM

          by DannyB (5839) Subscriber Badge on Monday November 13 2017, @04:27PM (#596262) Journal

          why they need to upload the image at all, they could have the hash computed in the user's browser and just upload that.

          Let's use a hypothetical example.

          Let's suppose there is this person, Killary Flinton. Killary uploads lots of picture hashes to block these pictures from ever appearing on FaceTwit. Killary claims these are embarrassing pictures taken earlier in life when Killary wanted to prove having larger testicles than any other candidate. FaceTwit accepts Killary's explanation of why these pictures should never appear on FaceTwit.

          Now there is this other person Ronald Rump.

          It turns out that the picture hashes Killary uploaded are actually of pictures of Ronald Rump that Killary wants to block. Killary's diabolical plan to rule the world is that Killary can get elected by blocking all possible imagery of Rump.

          Therefore humans at facebook need to see the naked pictures:
          1. in order to prevent abuse
          2. looking at hashes is not as gratifying as looking at pictures

          Hope that helps. :-)

          --
          When trying to solve a problem don't ask who suffers from the problem, ask who profits from the problem.
      • (Score: 2) by JNCF on Friday November 10 2017, @05:25PM

        by JNCF (4317) on Friday November 10 2017, @05:25PM (#595210) Journal

        Reading your link below, yes, they review beforehand. I guess no children, then.

      • (Score: 1, Insightful) by Anonymous Coward on Friday November 10 2017, @05:55PM

        by Anonymous Coward on Friday November 10 2017, @05:55PM (#595228)

        They can use this technology to censor any person a government or corporation doesn't want to exist.

        Any photo you upload to try to bring attention to the person is just, poof, gone.

        A few years and they will be able to simply insert an alternative person or generic stand-in as needed to replace someone.

        Technological censorship and revisionism is about to be taken to a whole other level.

  • (Score: 4, Informative) by takyon on Friday November 10 2017, @04:55PM (8 children)

    by takyon (881) <takyonNO@SPAMsoylentnews.org> on Friday November 10 2017, @04:55PM (#595185) Journal

    Facebook explains how it’ll review nude photos to stop revenge porn - Select company employees will indeed look at your nudes [theverge.com]

    Actually, Facebook Will Not Blur the Nudes Sent to Its Anti-Revenge Porn Program [vice.com]

    I give it another 24 hours at most (Saturday dead news cycle) before they apologize or quietly cancel the program.

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
    • (Score: 5, Funny) by Ethanol-fueled on Friday November 10 2017, @04:57PM (2 children)

      by Ethanol-fueled (2792) on Friday November 10 2017, @04:57PM (#595187) Homepage

      I tried to upload a nude of myself but it was rejected because the file size was too large.

      • (Score: 5, Funny) by The Mighty Buzzard on Friday November 10 2017, @05:25PM

        by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Friday November 10 2017, @05:25PM (#595209) Homepage Journal

        I tried telling you that that much alcohol would fuck up your calorie count but did you listen?

        --
        My rights don't end where your fear begins.
      • (Score: 0) by Anonymous Coward on Friday November 10 2017, @08:00PM

        by Anonymous Coward on Friday November 10 2017, @08:00PM (#595306)

        Are you sure it just didn't get rejected by the fugly filter?

    • (Score: 2) by takyon on Friday November 10 2017, @05:05PM

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Friday November 10 2017, @05:05PM (#595197) Journal
    • (Score: 2) by JoeMerchant on Friday November 10 2017, @07:38PM (3 children)

      by JoeMerchant (3937) on Friday November 10 2017, @07:38PM (#595295)

      So, if anyone at Facebook knew how to make a non-web based app, and any Facebook users were sophisticated enough to understand how to be relatively sure that the app is restricted from uploading by itself, they could make an app that would transform a photo into a hash and allow the user to copy-paste the hash into a web-enabled application that would get it into the Facebook database: to make this work no nudes need navigate network nodes.

      On the other hand, how hard do you think it would be to fool the hash algorithm by editing said photo, say with a little unflattering spatial distortion?

      --
      🌻🌻 [google.com]
      • (Score: 3, Funny) by GreatAuntAnesthesia on Friday November 10 2017, @08:41PM (1 child)

        by GreatAuntAnesthesia (3275) on Friday November 10 2017, @08:41PM (#595334) Journal

        mod +1 alliteration.

        • (Score: 2) by Gaaark on Saturday November 11 2017, @12:50AM

          by Gaaark (41) on Saturday November 11 2017, @12:50AM (#595420) Journal

          Nice catch!
          :)

          --
          --- Please remind me if I haven't been civil to you: I'm channeling MDC. ---Gaaark 2.0 ---
      • (Score: 0) by Anonymous Coward on Saturday November 11 2017, @04:18AM

        by Anonymous Coward on Saturday November 11 2017, @04:18AM (#595491)

        See my post in another thread about the clever photo-matching site TinEye.com. It can find similar images, even with cropping and photoshopping.

  • (Score: 5, Insightful) by tangomargarine on Friday November 10 2017, @05:00PM (25 children)

    by tangomargarine (667) on Friday November 10 2017, @05:00PM (#595190)

    scheme is moronic

    --
    "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
    • (Score: 3, Insightful) by Runaway1956 on Friday November 10 2017, @05:22PM (23 children)

      by Runaway1956 (2926) Subscriber Badge on Friday November 10 2017, @05:22PM (#595206) Journal

      Bingo! Tangomargarine wins all the internets for a month.

      The real problem is, there are millions of users who actually think that Facefook really is trustworthy.

      And, if Facefook were trustworthy, WTF makes anyone think that US half-intelligence agencies are trustworthy? If NSA wants a nekkid picture of you for blackmail purposes, make them send someone out to take the picture. Don't give it to them!!

      • (Score: 2) by The Mighty Buzzard on Friday November 10 2017, @05:28PM (22 children)

        by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Friday November 10 2017, @05:28PM (#595212) Homepage Journal

        Or just post them yourself. It's difficult to blackmail someone with public knowledge.

        --
        My rights don't end where your fear begins.
        • (Score: 2) by Runaway1956 on Friday November 10 2017, @05:42PM (4 children)

          by Runaway1956 (2926) Subscriber Badge on Friday November 10 2017, @05:42PM (#595219) Journal

          There is that. The most valid reason that the military had for banning homosexuals was that they could be blackmailed. Let the homos make the fact that they are homosexual public knowledge and, suddenly, they can't be blackmailed for being homos.

          On the other hand, ugly people probably don't want to post their ugly asses for the world to see.

          • (Score: 3, Funny) by The Mighty Buzzard on Friday November 10 2017, @06:08PM

            by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Friday November 10 2017, @06:08PM (#595237) Homepage Journal

            Ron. Jeremy.

            --
            My rights don't end where your fear begins.
          • (Score: 2, Funny) by Ethanol-fueled on Friday November 10 2017, @06:09PM (2 children)

            by Ethanol-fueled (2792) on Friday November 10 2017, @06:09PM (#595238) Homepage

            I disagree. I do agree that blackmail was the most valid reason the military (and other agencies) had for disqualifying gays from a security clearance, but the more obvious reason from the military perspective is that they didn't want buttsex and dyking out all over the place. It's difficult to keep discipline when you're surrounded by naked people you're attracted to.

            Funny that, as some armies of antiquity encouraged "brotherly love."

            • (Score: 0) by Anonymous Coward on Friday November 10 2017, @10:12PM (1 child)

              by Anonymous Coward on Friday November 10 2017, @10:12PM (#595374)

              ...prejudice dies hard, and the military—and its socially conservative supporters—were not about to let the inconvenient truth get in the way of their bias. They came up with a series of rationales for discrimination, each of which eventually fell: gay people were dubbed a security risk; then criminal; then mentally ill; then a threat to the family; then weak warriors; then a source of discomfort for the fragile egos of straight troops; then a medical risk in the time of AIDS; then tramplers of privacy. Finally, champions of military tradition devised the argument that gay people in uniform were a threat to unit cohesion and military readiness...

              (Slate [slate.com])

              I'm not a soldier, but I would imagine that "brotherly love" could actually promote "unit cohesion."

              • (Score: 0) by Anonymous Coward on Saturday November 11 2017, @01:25PM

                by Anonymous Coward on Saturday November 11 2017, @01:25PM (#595571)

                Up until the first jealous fight because so-and-so fucked so-and-so, and so-and-so is mine!

        • (Score: 2) by JNCF on Friday November 10 2017, @05:44PM (16 children)

          by JNCF (4317) on Friday November 10 2017, @05:44PM (#595221) Journal

          But if you're the sort of person who could be blackmailed with nude images, you wouldn't want to do that. I personally wouldn't care, but some people do. This doesn't seem like a helpful solution for those people.

          • (Score: 2) by The Mighty Buzzard on Friday November 10 2017, @06:02PM (15 children)

            by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Friday November 10 2017, @06:02PM (#595233) Homepage Journal

            Yeah, my views on that situation are much like my views on abortion though. Don't have unprotected sex if you don't want a kid and don't take pictures of it if you don't want them on the Internet. When exactly did it become hate speech to advise someone to take some responsibility for their actions and not be a complete idiot?

            --
            My rights don't end where your fear begins.
            • (Score: 2) by JNCF on Friday November 10 2017, @06:51PM (12 children)

              by JNCF (4317) on Friday November 10 2017, @06:51PM (#595272) Journal

              Don't smoke if you don't want cancer, chemotherapy should be illegal. #MAGA

              • (Score: 2) by The Mighty Buzzard on Friday November 10 2017, @08:50PM (11 children)

                by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Friday November 10 2017, @08:50PM (#595339) Homepage Journal

                Nah, insurance should. Pay for it and you can have anything you want guilt-free.

                --
                My rights don't end where your fear begins.
                • (Score: 3, Interesting) by JNCF on Friday November 10 2017, @08:59PM (10 children)

                  by JNCF (4317) on Friday November 10 2017, @08:59PM (#595344) Journal

                  Anything? Even abortions? Even... insurance?

                  • (Score: 3, Informative) by The Mighty Buzzard on Friday November 10 2017, @10:34PM (9 children)

                    by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Friday November 10 2017, @10:34PM (#595382) Homepage Journal

                    You don't pay for insurance. Insurance is a bet. You're betting that you're going to be sicker than most of the other people using the same bookie.

                    --
                    My rights don't end where your fear begins.
                    • (Score: 0) by Anonymous Coward on Saturday November 11 2017, @06:39AM (3 children)

                      by Anonymous Coward on Saturday November 11 2017, @06:39AM (#595516)

                      That's not how health insurance works. The rates the company pays are lower than what you'd pay if you wanted to pay out of pocket, because they were negotiated ahead of time.

                      You may get "lucky" and need a multi-million dollar surgery, but it doesn't take much health care to wind up saving money via insurance.

                      • (Score: 2) by The Mighty Buzzard on Saturday November 11 2017, @10:48AM (2 children)

                        by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Saturday November 11 2017, @10:48AM (#595550) Homepage Journal

                        Wow, you haven't clue one, do you? Insurance companies pay quite a lot more than you pay if you say at the outset you'll be paying cash.

                        --
                        My rights don't end where your fear begins.
                        • (Score: 0) by Anonymous Coward on Saturday November 11 2017, @04:35PM (1 child)

                          by Anonymous Coward on Saturday November 11 2017, @04:35PM (#595627)

                          Only because they don't perform anything beyond the bare minimum knowing they may be on the hook for it. Insurance companies at least have predictable patterns about what will and won't be covered.

                          • (Score: 2) by The Mighty Buzzard on Saturday November 11 2017, @04:57PM

                            by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Saturday November 11 2017, @04:57PM (#595635) Homepage Journal

                            Wrong. Do some actual research. The exact same hospital stay with the exact same line items will cost you significantly more paying with insurance than with cash. That's not even taking into account the cash-only medical facilities that've sprung up since Obamacare which charge even less than the cash price at anywhere that takes insurance.

                            --
                            My rights don't end where your fear begins.
                    • (Score: 2) by JNCF on Saturday November 11 2017, @03:31PM (4 children)

                      by JNCF (4317) on Saturday November 11 2017, @03:31PM (#595602) Journal

                      I agree with your skepticism of insurance, but not your definition of paying. I suppose I can't pay for lottery tickets, stocks, bonds, gold coins, bitcoins, or tulips, either? Does big daddy government need to protect me from all bets, or is this just an ad hoc rule you came up with because you don't like insurance?

                      • (Score: 2) by The Mighty Buzzard on Saturday November 11 2017, @03:56PM (3 children)

                        by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Saturday November 11 2017, @03:56PM (#595612) Homepage Journal

                        Oh you're paying when you place a bet but you're not paying for a lottery win, a stock price increase, healthcare, etc... You're paying for the chance that you may come out on top, though probably not.

                        --
                        My rights don't end where your fear begins.
                        • (Score: 2) by JNCF on Saturday November 11 2017, @04:45PM (2 children)

                          by JNCF (4317) on Saturday November 11 2017, @04:45PM (#595632) Journal

                          And that should be illegal, as per your comment before last? You can turn in your libertarian card at the nearest gun store, Flighty Uzzard.

                          • (Score: 2) by The Mighty Buzzard on Saturday November 11 2017, @05:03PM (1 child)

                            by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Saturday November 11 2017, @05:03PM (#595638) Homepage Journal

                            Nah, that was just me goading you over your parent comment. You know, saying something as absurd as what I was replying to...

                            --
                            My rights don't end where your fear begins.
                            • (Score: 2) by JNCF on Saturday November 11 2017, @05:23PM

                              by JNCF (4317) on Saturday November 11 2017, @05:23PM (#595643) Journal

                              And here, that's what I thought I was doing...

            • (Score: 0) by Anonymous Coward on Friday November 10 2017, @09:02PM (1 child)

              by Anonymous Coward on Friday November 10 2017, @09:02PM (#595345)

              Birth control is not flawless. It's better to not have sex at all if you don't want a kid. Also, hope you don't get raped.

              But at least abortion is legal so the problem can be taken care of if desired.

              • (Score: 3, Informative) by The Mighty Buzzard on Friday November 10 2017, @10:40PM

                by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Friday November 10 2017, @10:40PM (#595385) Homepage Journal

                Rape, incest, and responsibly using birth control but having it still fail are extremely rare exceptions that I'll be happy to address separately. For the overwhelming majority of abortion cases though, it's being used instead of a condom. Example: In 2013, 29,007 abortions were performed on black women in NYC while only 24,108 black women gave birth.

                --
                My rights don't end where your fear begins.
    • (Score: 5, Informative) by realDonaldTrump on Friday November 10 2017, @07:34PM

      by realDonaldTrump (6614) on Friday November 10 2017, @07:34PM (#595292) Homepage Journal

      Nobody ever looks at the Original Submission. But I put in a very telling quote from Mark Zuckerberg. He said "they trust me -- dumb fucks." Let me tell you, people who trust him are very foolish. Very foolish! He wants to be your president. Don't do it! He's a smart cookie. But a very bad or sick guy. There is something bad or sick going on with him. #TRUMP2020

  • (Score: 4, Insightful) by Fnord666 on Friday November 10 2017, @05:20PM

    by Fnord666 (652) on Friday November 10 2017, @05:20PM (#595204) Homepage
    Remember to save revenge porn photos in a new file format before uploading.
  • (Score: 5, Insightful) by bradley13 on Friday November 10 2017, @05:27PM (6 children)

    by bradley13 (3053) on Friday November 10 2017, @05:27PM (#595211) Homepage Journal

    There are so many problems with this, you really have to wonder: which PHB actually approved it? Let's see:

    - You have no idea which photos your ex might upload, so you have to upload lots more just to be sure.

    - If you took video, are you supposed to upload each frame individually?

    - Anyway, you don't have the photos or videos that your ex took on their phone, so it's not going to help anyway.

    - Meanwhile, some random strangers are looking at your pics.

    - If you're attractive, there is a non-zero chance that your pics will be copied, shared, and even uploaded to a porn site.

    On the positive side, this may help a few people realize just how totally creepy Facebook really is...

    --
    Everyone is somebody else's weirdo.
    • (Score: 2) by The Mighty Buzzard on Friday November 10 2017, @05:30PM (1 child)

      by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Friday November 10 2017, @05:30PM (#595213) Homepage Journal

      Sounds more like a PFY scheme to me. Get all your users to send you nudie pics which you can then blackmail them with.

      --
      My rights don't end where your fear begins.
      • (Score: 3, Funny) by nobu_the_bard on Friday November 10 2017, @06:14PM

        by nobu_the_bard (6373) on Friday November 10 2017, @06:14PM (#595245)

        6 months later...

        "Beginning tomorrow morning, only paid Facebook accounts may keep their content private, and will have the ability to delete uploads, in order to best serve our customers..."

    • (Score: 4, Interesting) by halcyon1234 on Friday November 10 2017, @06:41PM (1 child)

      by halcyon1234 (1082) on Friday November 10 2017, @06:41PM (#595267)

      Another issue:

      There is now a brand new vector for phishing. From "Faecbook Securty" "YOUR NUDES HAVE BEEN POSTED! Plz send us teh nudes so we remove them from internet and anyone who see them, immediately, thank you."

      --
      Original Submission [thedailywtf.com]
      • (Score: 5, Interesting) by nobu_the_bard on Friday November 10 2017, @06:52PM

        by nobu_the_bard (6373) on Friday November 10 2017, @06:52PM (#595273)

        As someone who maintains various mail systems, they are already doing this.

    • (Score: 0) by Anonymous Coward on Friday November 10 2017, @09:45PM (1 child)

      by Anonymous Coward on Friday November 10 2017, @09:45PM (#595359)

      And if the ex applies a filter or, say, changes the green value in pixel 523097 from 180 to 179, the hash should change.

      Hard to believe Facebook engineers are really this naive? Maybe it's a social experiment, just to demo how gullible people are?

      • (Score: 0) by Anonymous Coward on Saturday November 11 2017, @04:24AM

        by Anonymous Coward on Saturday November 11 2017, @04:24AM (#595494)

        > changes the green value in pixel 523097 from 180 to 179, the hash should change.

        See the earlier post about TinEye.com; it matches images with a number of "defects" like this, and works with different cropping and with photoshopped mods.

  • (Score: 4, Insightful) by Anonymous Coward on Friday November 10 2017, @05:44PM (2 children)

    by Anonymous Coward on Friday November 10 2017, @05:44PM (#595222)

    - How did you get all those nude photos?
    - They uploaded them!
    - Why?
    - They "trust me". Dumb fucks.

    • (Score: 0) by Anonymous Coward on Friday November 10 2017, @05:58PM (1 child)

      by Anonymous Coward on Friday November 10 2017, @05:58PM (#595230)

      There it is. Have another mod point!

      • (Score: 0) by Anonymous Coward on Friday November 10 2017, @09:46PM

        by Anonymous Coward on Friday November 10 2017, @09:46PM (#595361)

        its what I thought, too

        how come someone hasn't gotten him fired yet? i guess maybe that since he is mr facebook, he knows better than to let secrets get online. still, the guy needs a good slapping because he's evil in a way that simply goes beyond what everyone expected bill gates to be.

        windows is just software. facebook has ensnared so much more that MS had to redo windows to try to catch up, because they couldn't get the dumb fucks to willingly hand over all of their secrets...

  • (Score: 5, Insightful) by Thexalon on Friday November 10 2017, @05:44PM (7 children)

    by Thexalon (636) on Friday November 10 2017, @05:44PM (#595223)

    What we need is a societal change that recognizes two things:
    1. Everybody is naked sometimes. Everybody has genitalia of some kind, and there aren't that many varieties really.
    2. The vast majority of people have sex at some point in their life.

    Until evidence that people are sometimes naked and sometimes have sex becomes no longer scandalous, those trying to embarrass others with that stuff will be able to do so.

    --
    The only thing that stops a bad guy with a compiler is a good guy with a compiler.
    • (Score: 0) by Anonymous Coward on Friday November 10 2017, @05:54PM

      by Anonymous Coward on Friday November 10 2017, @05:54PM (#595227)

      Lead by example. Inspect kids' genitals.

    • (Score: 2) by The Mighty Buzzard on Friday November 10 2017, @06:07PM (3 children)

      by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Friday November 10 2017, @06:07PM (#595236) Homepage Journal

      Nah, revenge porn will just move to threatening to show you pictures of your grandparents getting their freak on when that happens.

      --
      My rights don't end where your fear begins.
      • (Score: 1) by Ethanol-fueled on Friday November 10 2017, @06:12PM (2 children)

        by Ethanol-fueled (2792) on Friday November 10 2017, @06:12PM (#595242) Homepage

        You never know who will go into politics or express anti-government sentiment when they get older.

        • (Score: 2) by Thexalon on Friday November 10 2017, @08:17PM (1 child)

          by Thexalon (636) on Friday November 10 2017, @08:17PM (#595317)

          Which again, will have no effect if people simply accept that politicians also get it on sometimes. I mean, nobody really wants to picture Angela Merkel doing it, but I'm reasonably certain she has and probably still does.

          --
          The only thing that stops a bad guy with a compiler is a good guy with a compiler.
    • (Score: 0) by Anonymous Coward on Friday November 10 2017, @06:25PM

      by Anonymous Coward on Friday November 10 2017, @06:25PM (#595252)

      The religious people will never come to terms with this. After all, the guy who created the entire universe is interested in nothing like what joe and jane do in the privacy of their bedroom.

    • (Score: 0) by Anonymous Coward on Saturday November 11 2017, @04:31AM

      by Anonymous Coward on Saturday November 11 2017, @04:31AM (#595495)

      > What we need is a societal change ...

      Been there, done that (on a small scale). Lived in a communal space with a dozen (give or take) others c.1980. Clothing optional, without A/C in the summer we were mostly nude. I was very bashful as a kid, but after a few days there with friends, the inhibitions went away and bodies were just bodies. No big deal.

      The place lasted for about 10 years, until the building was renovated and all the tenants (including normal businesses on the lower floors) were kicked out. In all that time I only remember a couple of people that didn't adapt quickly.

  • (Score: 2) by looorg on Friday November 10 2017, @06:05PM (9 children)

    by looorg (578) on Friday November 10 2017, @06:05PM (#595234)

    Facebook says it's not storing a copy of the photo, but only computing the file's hash and adding it to its database of revenge porn imagery.

    So they are hashing streams now? Totally not storing the image even temporarily and then deleting it right after they are done, because deleting content is what Facebook is known for... If sharing all your filthy amateur p0rn with Facebook, to protect you from revenge p0rn from all the creepy dudes (or ladies) you have been banging, doesn't sound like a trap, I don't know what ever will.

    Victims who fear that former or current partners may upload a nude photo online can pro-actively take this step

    Perhaps they should just integrate that into the phones photo-app. "Hi! Clippy-the-Pimp has detected a lot of fleshtones in your picture and my AI sub-processor has deemed the content lewd. A copy has been automagically CC:ed to the Facebook cloud-anti-revenge-p0rn-hash-library to keep you safe!"

    • (Score: 2) by Runaway1956 on Friday November 10 2017, @06:22PM (6 children)

      by Runaway1956 (2926) Subscriber Badge on Friday November 10 2017, @06:22PM (#595250) Journal

      "only computing the file's hash"

      So, how much does a photo need to change before it generates a different hash? A little cropping for a closeup? A red-eye filter on it? Change a few pixels in a corner? Hide an encrypted file in it? Maybe just strip the metadata?

      I wonder how much their little scheme relies on metadata, itself? "We've got a match here John!! Same time, date, and GPS coordinates as these six photos!!" (oversimplified, of course, but the algorithms may well be using that metadata)

      • (Score: 1, Interesting) by Anonymous Coward on Friday November 10 2017, @06:34PM (2 children)

        by Anonymous Coward on Friday November 10 2017, @06:34PM (#595262)

        Presumably it's not just one hash; it's a half-dozen or more, done with varying degrees of filtering. If you divide an image up into 8 pieces, it's hard to change the original so that it doesn't have the same relationship between the samples. The downside is that if it's only 8, there are going to be a ton of images that match. You can increase that number to something larger like 128 or more, in which case it'll be more susceptible to this kind of manipulation. You can also compare individual samples between the two.

        There are other things you can do, like taking a high-pass filter of the two images and comparing those checksums. It's fairly hard to keep those from coming out similarly without massively altering the original.

        And ultimately, you don't necessarily have to require a 100% match; false positives that somebody has to manually review are not necessarily a problem if they are relatively unusual and the image remains in place pending review.

        You're unlikely to ever completely solve the problem without manually comparing images, but just making it less convenient to post will likely cut down on the sharing that goes on.
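
        A minimal sketch of the chunk-and-filter idea, assuming Pillow: shrink the image to an 8x8 grayscale grid and record which cells beat the mean brightness. This is the classic "average hash", offered as an illustration rather than as Facebook's actual method; the 64-bit result survives resizing, recompression, and single-pixel edits, and the Hamming distance between two hashes measures similarity:

            from PIL import Image

            def average_hash(path: str, size: int = 8) -> int:
                # Downscale and desaturate: a crude low-pass filter.
                img = Image.open(path).convert("L").resize((size, size))
                pixels = list(img.getdata())
                mean = sum(pixels) / len(pixels)
                # One bit per cell: is this region brighter than average?
                bits = 0
                for p in pixels:
                    bits = (bits << 1) | (1 if p > mean else 0)
                return bits

            def distance(h1: int, h2: int) -> int:
                # Small Hamming distance means "probably the same image".
                return bin(h1 ^ h2).count("1")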

        • (Score: 1) by khallow on Friday November 10 2017, @08:17PM (1 child)

          by khallow (3766) Subscriber Badge on Friday November 10 2017, @08:17PM (#595316) Journal
          The point of a hash is that close is not good enough. With the scheme described in this thread, it sounds like you can recreate the picture by playing a game of warmer/colder (the nudie will be the near unique result with nice gradients).
          • (Score: 0) by Anonymous Coward on Saturday November 11 2017, @06:34AM

            by Anonymous Coward on Saturday November 11 2017, @06:34AM (#595515)

            And having to have a new hash to cover every single pixel change on every image on FB would easily overwhelm the storage capacity of every storage device on the planet.

            Humans can identify an image as being the same despite minor changes; a traditional hash isn't going to handle that very well without something being done to normalize the data. Chunking and blurring removes a lot of minor variation without using the processing power of more in-depth analysis.

            The point is that rather than comparing an image against all of the images, you can compare against just the ones sharing the abbreviated match. Then you can do something more resource-intensive to verify or rule out the match.

            Just comparing checksums would be a waste of resources, with an unforgivable number of false negatives.

      • (Score: 2) by looorg on Friday November 10 2017, @06:49PM

        by looorg (578) on Friday November 10 2017, @06:49PM (#595270)

        Since they don't really, from what I have seen so far, go into any great depth about how this revenge-p0rn protection scheme will work, it's quite hard to know. Will they even store metadata? Will there just be one hash per image, or will they run a few different ones? Considering the number of people on earth who apparently like to take pictures of themselves showing tits, dicks, and all things between, there should be a fairly large number of these created on a daily basis. It shouldn't take too much time until you run into one of them hashing paradoxes and there is a collision, and you have two or more images that generate the same hash (two files, one hash...). That might not matter all that much in this case, though, since they would just be blocking a few more p0rn images than they intended. There might be an issue if they want to notify the original uploader that their image has been shared, and the system automatically messages two different people about it.

      • (Score: 4, Funny) by JNCF on Friday November 10 2017, @07:07PM

        by JNCF (4317) on Friday November 10 2017, @07:07PM (#595276) Journal

        So, how much does a photo need to change, before it generates a different hash?

        Just a bit.

      • (Score: 0) by Anonymous Coward on Saturday November 11 2017, @04:35AM

        by Anonymous Coward on Saturday November 11 2017, @04:35AM (#595496)

        See comments elsewhere about TinEye.com -- it can match images with all kinds of changes, including parodies with photoshopping. I'm assuming FB will do something like this, maybe just buy the TinEye technology, lord knows they could afford it.

    • (Score: 0) by Anonymous Coward on Friday November 10 2017, @06:29PM (1 child)

      by Anonymous Coward on Friday November 10 2017, @06:29PM (#595259)

      Facebook says it's not storing a copy of the photo, but only computing the file's hash...

      It depends entirely on the article you read. The first one I read said this. A later one confirmed that they retain the photos, but blur them, and share them with select people. A later one said that they won't blur them at all and staff will definitely be looking at everything.

      Bottom line, Facebook wants to become a hub of amateur pr0n the same way they've become the envy of the NSA for personal information.

      • (Score: 2) by HiThere on Friday November 10 2017, @07:08PM

        by HiThere (866) Subscriber Badge on Friday November 10 2017, @07:08PM (#595279) Journal

        Maybe they're planning on using them as input to their new AI image generator...everything guaranteed artificial, but lifelike.

        --
        Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
  • (Score: 4, Insightful) by Grishnakh on Friday November 10 2017, @06:10PM (5 children)

    by Grishnakh (2831) on Friday November 10 2017, @06:10PM (#595240)

    They want you to upload your nude pictures so they can create hashes of the pictures and use those to compare against photos your associates may upload? They don't see the potential for abuse here? Worse, how does it work with under-age people? It's not like they're immune to this, but it creates a giant liability for FB if they're storing this material.

    Here's a better idea: they give you a program you run on *your* computer, and you use that with your nude photos to calculate the hashes. Then you send those hashes to Facebook. This way, you know that the only thing FB has is just a bunch of hashes, not your actual nude photos. The main problem with this is verifying that FB's program actually does only what it claims, and doesn't upload the photos themselves to FB, but independent researchers can easily verify that with Wireshark (plus, if you have a bunch of photos, or even just a few videos, the upload time for those is orders of magnitude greater than some hashes, and should be immediately noticeable).

    I guess this idea is a non-starter because no one (esp. Facebook users) wants to run any native application software any more, and wants everything to be done in the web browser. But even here, this stuff could be done client-side in Javascript.
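
    A sketch of such a local tool, assuming plain SHA-256 digests are what would get submitted; the script only prints hashes and never transmits anything, which is exactly what a Wireshark check should confirm:

        import hashlib
        import sys

        def hash_file(path: str) -> str:
            # Stream the file in chunks so large videos don't fill memory.
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(65536), b""):
                    h.update(chunk)
            return h.hexdigest()

        if __name__ == "__main__":
            # Usage: python hash_photos.py photo1.jpg photo2.jpg ...
            for path in sys.argv[1:]:
                print(hash_file(path), path)
            # Only these hex digests would ever be sent, never the photos.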

    • (Score: 1) by Ethanol-fueled on Friday November 10 2017, @06:12PM (2 children)

      by Ethanol-fueled (2792) on Friday November 10 2017, @06:12PM (#595243) Homepage

      How about just not using Facebook? Oh, right, how could I forget, people are vain idiots.

      • (Score: 0) by Anonymous Coward on Friday November 10 2017, @07:27PM (1 child)

        by Anonymous Coward on Friday November 10 2017, @07:27PM (#595290)

        The problem with that is that by definition revenge porn is posted by somebody other than the target. So, if the target doesn't have a FB account, it just means fewer "friends" see it. But strangers might see it and most people have friends that are on FB, so those people can see as well.

        • (Score: 2, Interesting) by Ethanol-fueled on Friday November 10 2017, @09:53PM

          by Ethanol-fueled (2792) on Friday November 10 2017, @09:53PM (#595364) Homepage

          If I suspected that I were a victim of revenge porn, I'd browse /b/ before I'd browse Facebook.

          Also, I have an old story I posted on Slashdot and may have posted here. Pardon if I already posted it here. Anyway, when I was a freshman in high school, a disgruntled ex-boyfriend of a 16 year old student went through the trouble to enlarge a photo of his ex girlfriend topless (including face) and run off a shit-ton of copies of it, then at night spread those "leaflets" all over the campus. They were everywhere - tucked in tree branches, hidden under trash cans, stuffed into mailboxes, and generally all over the ground everywhere. The administration and other staff were still finding and picking up copies well into the afternoon after an all-day effort.

          No cops were called, there were no mentions of "child porn." This was back in the day when anything with a set of tits wasn't considered "child porn."

    • (Score: 0) by Anonymous Coward on Friday November 10 2017, @06:17PM

      by Anonymous Coward on Friday November 10 2017, @06:17PM (#595248)

      Your mention of abuse made me think of something: see a meme you don't like? Label it "revenge porn".

      I guess that is why you are supposed to submit the files ahead of time.

    • (Score: 0) by Anonymous Coward on Friday November 10 2017, @06:27PM

      by Anonymous Coward on Friday November 10 2017, @06:27PM (#595254)

      But then how would Zuckerberg get copies?

  • (Score: 0) by Anonymous Coward on Friday November 10 2017, @06:14PM (2 children)

    by Anonymous Coward on Friday November 10 2017, @06:14PM (#595244)

    If they were serious about it, they would let people compute the hashes on their own machine, without uploading the original file to facebook.

    I suppose it would be slightly easier to avoid the filter in that case, but only slightly.

    • (Score: 2) by Jiro on Friday November 10 2017, @07:07PM (1 child)

      by Jiro (3176) on Friday November 10 2017, @07:07PM (#595277)

      They can't let you do it on your own computer. If they did, people could maliciously report someone's non-porn image as porn.

      • (Score: 0) by Anonymous Coward on Friday November 10 2017, @07:54PM

        by Anonymous Coward on Friday November 10 2017, @07:54PM (#595303)

        If they calculate the hashes on the server, that can't happen?
