posted by martyb on Friday June 28 2019, @01:14PM   Printer-friendly
from the I-spy-with-my-little-eye dept.

Katyanna Quach over at El Reg is reporting on the removal of the DeepNude web and desktop apps from the developers' website. DeepNude is an application that takes photos of clothed women (apparently, the app does not function properly with photos of males -- there's a shocker!), digitally removes the clothing, and adds realistic-looking naughty bits.

From the article:

A machine-learning-powered perv super-tool that automagically removed clothes from women in photos to make them appear naked has been torn offline by its makers.

The shamefaced creators of the $50 Windows and Linux desktop app DeepNude claimed they were overwhelmed by demand from internet creeps: the developers' servers apparently buckled under a stampede of downloads, their buggy software generated more crash reports than they could deal with, and this all came amid a firestorm of social media outrage.

[...] Basement dwellers and trolls could feed it snaps of celebrities, colleagues, ex-girlfriends, and anyone else who takes their fancy, and have the software guess, somewhat badly, what they look like underneath their clothes, keeping their faces intact. These bogus nudes are perfect for distributing around the 'net to humiliate victims.

There was so much interest in this misogynistic piece of crap that the site's servers couldn't handle the traffic and crashed, it is claimed. The team initially said on Thursday they were trying to fix bugs, and expected everything to be up and running again in a few days:

Hi! DeepNude is offline. Why? Because we did not expect these visits and our servers need reinforcement. We are a small team. We need to fix some bugs and catch our breath. We are working to make DeepNude stable and working. We will be back online soon in a few days.
        — deepnudeapp (@deepnudeapp) June 27, 2019

Shortly after that message, they changed their tune. Instead of trying to bring it back online, the developers decided to pull the plug on deepnude.com completely, kill off distribution of the code, and hope the scandal just goes away.

"The world is not yet ready for DeepNude," the team, based in Estonia, said on Thursday. Or rather, quite likely, the team wasn't ready for all the criticism and rage lobbed its way on Twitter, Facebook, and elsewhere, as a result of its work.

It's unsurprising that an application with this big a potential for abuse would cause such outrage. Of course, it's not really gone, as it's still available from various torrent sites.

So what say you? Obviously, the genie can't be put back in the bottle, so (as the author of TFA put it) the "basement dwellers and trolls" will be creating naked pics of, well, everyone, for a long time to come.

Of course, deepfake video can have your exes and your friends' moms/daughters/grandmothers engaging in hardcore porn, but those techniques aren't (yet) available to the masses. This app, however, can be used by just about anyone *right now*.

What will this do to the quality of still image soft-core porn? Will the courts get involved? How should this be dealt with (if at all)?

Could widespread use of tools like this (and there will be more, of course) finally change how the hoi polloi protect their digital images?

Bonus question: Whose photo(s) will *you* run through this software?

Other coverage:
https://www.theverge.com/2019/6/27/18761496/deepnude-shuts-down-deepfake-nude-ai-app-women
https://www.vice.com/en_us/article/kzm59x/deepnude-app-creates-fake-nudes-of-any-woman
https://www.vice.com/en_us/article/qv7agw/deepnude-app-that-undresses-photos-of-women-takes-it-offline
https://www.dailydot.com/debug/deepnude-app-pulled-offline/


Original Submission

Related Stories

Politics: Veritas Claims Leaked Internal E-Mails from Google Showing Political Bias of Results 100 comments

[Editor's note: This story has an interesting viewpoint given the proliferation of "Deep Fake" videos we recently covered here. I see it as a portent of discussions to come. How much can we trust reporting? How much slanting and posturing of "reports" and "studies" is going to be promulgated in the lead-up to the next presidential election? Is this item all a bunch of crap or an indication of things we can expect to come? How much can we trust, and how do we go about assessing the veracity of what is presented to us not only by the mainstream media but also by social media? We hereby disclaim any assurance as to the credibility of the accusations made here and present it solely as an example of what may be coming -- and an opportunity to practice techniques at validating/corroborating or challenging/refuting it. The story submission appears after the break.]

GitHub Censors "Sexually Obscene" DeepNude Code 102 comments

GitHub is banning copies of 'deepfakes' porn app DeepNude

GitHub is banning code from DeepNude, the app that used AI to create fake nude pictures of women. Motherboard, which first reported on DeepNude last month, confirmed that the Microsoft-owned software development platform won't allow DeepNude projects. GitHub told Motherboard that the code violated its rules against "sexually obscene content," and it's removed multiple repositories, including one that was officially run by DeepNude's creator.

DeepNude was originally a paid app that created nonconsensual nude pictures of women using technology similar to AI "deepfakes." The development team shut it down after Motherboard's report, saying that "the probability that people will misuse it is too high." However, as we noted last week, copies of the app were still accessible online — including on GitHub.

Late that week, the DeepNude team followed suit by uploading the core algorithm (but not the actual app interface) to the platform. "The reverse engineering of the app was already on GitHub. It no longer makes sense to hide the source code," wrote the team on a now-deleted page. "DeepNude uses an interesting method to solve a typical AI problem, so it could be useful for researchers and developers working in other fields such as fashion, cinema, and visual effects."

Also at The Register, Vice, and Fossbytes.

Previously: "Deep Nude" App Removed By Developers After Brouhaha

Related: AI-Generated Fake Celebrity Porn Craze "Blowing Up" on Reddit
Discord Takes Down "Deepfakes" Channel, Citing Policy Against "Revenge Porn"
My Struggle With Deepfakes
Deep Fakes Advance to Only Needing a Single Two Dimensional Photograph


Original Submission

Software Under Development to Detect and Delete Unsolicited Nude Images 65 comments

Unsolicited nudes detected and deleted by AI

Software that can detect and delete unsolicited penis pictures sent via private messages on Twitter is being developed by researchers in Seattle. The project was started after developer Kelsey Bressler was sent an unsolicited nude photo by a man. She is now helping a friend refine an artificial intelligence system that can detect the unwanted penis pictures and delete them before they are ever seen.

She said social networks could do more to protect users from cyber-flashing. "When you receive a photo unsolicited it feels disrespectful and violating," Ms Bressler told the BBC. "It's the virtual equivalent of flashing someone in the street. You're not giving them a chance to consent, you are forcing the image on them, and that is never OK."

To test and train the artificial intelligence system, Ms Bressler and her team set up a Twitter inbox where men were invited to "send nudes for science". So many volunteered their nude photos that the team has had to close the inbox.
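
The BBC piece doesn't describe the implementation, but the general shape of such a filter is simple: score each incoming image attachment with a nudity classifier and delete it if the score crosses a threshold. Below is a minimal, purely hypothetical PyTorch sketch of that idea; the fine-tuned weights, the filenames, and the messaging hooks are stand-ins, not the team's actual code or any real Twitter API:

    import torch
    from torchvision import models, transforms
    from PIL import Image

    # Hypothetical binary classifier: a stock ResNet-18 backbone with a
    # two-class head ("fine" vs "explicit"), assumed to have been fine-tuned
    # elsewhere on volunteered training images.
    classifier = models.resnet18(num_classes=2)
    # classifier.load_state_dict(torch.load("nudity_classifier.pth"))  # stand-in checkpoint
    classifier.eval()

    preprocess = transforms.Compose([
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
    ])

    def is_explicit(path, threshold=0.9):
        """Return True if the image should be deleted before the user ever sees it."""
        x = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
        with torch.no_grad():
            probs = torch.softmax(classifier(x), dim=1)
        return probs[0, 1].item() >= threshold

    # Placeholder loop; a real bot would pull attachments via Twitter's DM API.
    for attachment in ["dm_photo_1.jpg", "dm_photo_2.jpg"]:
        if is_explicit(attachment):
            print(f"deleting {attachment} before it reaches the inbox")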

Related: "Deep Nude" App Removed By Developers After Brouhaha
GitHub Censors "Sexually Obscene" DeepNude Code


Original Submission

  • (Score: 2, Offtopic) by Runaway1956 on Friday June 28 2019, @02:17PM (3 children)

    by Runaway1956 (2926) Subscriber Badge on Friday June 28 2019, @02:17PM (#860925) Journal

    TFS mentions torrents, so I looked. Five different torrents, with well over 1000 seeders among them. Nothing surprising there. But, 1.92 GB? Holy crap, that's a big download for an executable. Must be stacked high with libraries and crap. It will probably require a not-insignificant portion of my resources, and I have a lot of cores and memory to work with. And, TFS mentions, specifically, that the app does a rather poor job.

    I'm going to download it, and run it in a virtual machine, just to see how bad it can be. I'm curious how it works under the hood - but I've gotten lazy, and may not try to decompile it. It's Windows anyway, which is a pain in the ass when trying to look behind the scenes. It would be good if someone good at this kind of thing popped the hood for us, and showed us how and why it's a POS. Would it be surprising if they hard coded into the app the xx best looking Playboy Playmates of all time? So, every woman you "undress" looks like a centerfold.

    • (Score: 4, Interesting) by NotSanguine on Friday June 28 2019, @02:31PM (2 children)

      TFS mentions torrents, so I looked. Five different torrents, with well over 1000 seeders among them. Nothing surprising there. But, 1.92 GB?

      I'd guess that a lot of that is the machine learning database.

      Interestingly, there is, supposedly, a Linux version, but I didn't see anything but the Windows versions on any of the torrent sites.

      AFAIK, this code is *not* open source either. Does that mean that those with Linux are more ethical (not the sharing part, but the not wanting others to use the app to create fake nudes of their mother-in-law) or just more selfish than Windows users?

      --
      No, no, you're not thinking; you're just being logical. --Niels Bohr
      • (Score: 4, Interesting) by takyon on Friday June 28 2019, @02:40PM (1 child)

        by takyon (881) <takyonNO@SPAMsoylentnews.org> on Friday June 28 2019, @02:40PM (#860936) Journal

        Yeah. Nice that the Estonians could get a little payday. But it's going to take open source to keep this type of application widely available, uncensorable, and always improving.

        There's no rush. Give it a year or two and it will exist, along with a new generation of GPUs to power it.

        --
        [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
        • (Score: 4, Informative) by NotSanguine on Friday June 28 2019, @02:47PM

          There's no rush. Give it a year or two and it will exist, along with a new generation of GPUs to power it.

          Apparently, the app was adapted from pix2pix [github.io], which *is* open source.

          I suspect that within that year or two, we'll have apps like this for video too. And even if you don't have a lot of GPU juice, it will just take longer. In 5-10 years, your standard desktop/laptop will be able to gin up videos in minutes, I'm sure.
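
          For anyone curious what "adapted from pix2pix" might mean in practice: pix2pix is a conditional GAN for image-to-image translation, and at inference time the whole pipeline is just "normalize an input photo, push it through a trained generator, denormalize the result." Here's a minimal, purely hypothetical PyTorch sketch of that pipeline -- the tiny generator and the checkpoint name are stand-ins (nobody here has seen the actual model), it's only meant to show the shape of the thing:

            import torch
            import torch.nn as nn
            from PIL import Image
            from torchvision import transforms

            # Stand-in generator: the real pix2pix generator is a U-Net with skip
            # connections; two conv layers are enough to show the API surface.
            class TinyGenerator(nn.Module):
                def __init__(self):
                    super().__init__()
                    self.net = nn.Sequential(
                        nn.Conv2d(3, 64, kernel_size=3, padding=1),
                        nn.ReLU(inplace=True),
                        nn.Conv2d(64, 3, kernel_size=3, padding=1),
                        nn.Tanh(),  # outputs in [-1, 1], matching the input normalization
                    )

                def forward(self, x):
                    return self.net(x)

            # Normalize the input photo to the range the generator was trained on.
            preprocess = transforms.Compose([
                transforms.Resize((256, 256)),
                transforms.ToTensor(),                       # [0, 1]
                transforms.Normalize([0.5] * 3, [0.5] * 3),  # -> [-1, 1]
            ])

            generator = TinyGenerator()
            # generator.load_state_dict(torch.load("generator.pth"))  # hypothetical checkpoint
            generator.eval()

            image = preprocess(Image.open("input.jpg").convert("RGB")).unsqueeze(0)

            with torch.no_grad():
                fake = generator(image)

            # Undo the normalization and write out the translated image.
            fake = (fake.squeeze(0) * 0.5 + 0.5).clamp(0, 1)
            transforms.ToPILImage()(fake).save("output.jpg")

          The hard part, of course, is the trained weights and the paired training data, not this scaffolding. Which is presumably why the download is almost 2 GB.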

          --
          No, no, you're not thinking; you're just being logical. --Niels Bohr
  • (Score: 2) by bzipitidoo on Friday June 28 2019, @02:18PM (11 children)

    by bzipitidoo (4388) on Friday June 28 2019, @02:18PM (#860927) Journal

    This DeepNude, and many other potentially humiliating things, are just over the horizon if they aren't already here. I see no practical way to stop any of it. Should we want to stop it? Regardless, we'll simply have to adjust.

    How about a "Deep Youth"? An app that can make an old person look young. Maybe it uses an old photo and changes the clothes and background to whatever is in fashion currently, or a recent photo and smooths away the wrinkles, moles, and scars, fills in bald spots, straightens stooped shoulders, and so on.

    On an online forum devoted to ideas for inventions, one of the women posted a wish for "robot hubby". Robot Hubby would never whine or complain, or be lazy and sit around drinking beers while dirty dishes were overflowing the sink and trash was overflowing the waste baskets. She was especially interested in the ability to turn the robot off and shove it (him?) into a closet for however long she wished.

    • (Score: 2, Insightful) by Anonymous Coward on Friday June 28 2019, @02:27PM (2 children)

      by Anonymous Coward on Friday June 28 2019, @02:27PM (#860930)

      On an online forum devoted to ideas for inventions, one of the women posted a wish for "robot hubby". Robot Hubby would never whine or complain, or be lazy and sit around drinking beers while dirty dishes were overflowing the sink and trash was overflowing the waste baskets. She was especially interested in the ability to turn the robot off and shove it (him?) into a closet for however long she wished.

      Well, I've been looking for a live-in, naked maid for a while now. But I'd have a kennel for her, not a closet. With a nice, comfy cushion in it. I'm sure we'll both get what we want eventually.:)

      • (Score: 3, Informative) by krishnoid on Friday June 28 2019, @08:14PM (1 child)

        by krishnoid (1156) on Friday June 28 2019, @08:14PM (#861094)

        The maids I'm thinking of that meet your criteria also pre-wash the stuff loaded into the dishwasher, immediately clean any food that falls on the floor, and play fetch. So I think you can have what you want even today.

        • (Score: 0) by Anonymous Coward on Saturday June 29 2019, @03:06AM

          by Anonymous Coward on Saturday June 29 2019, @03:06AM (#861214)

          The maids I'm thinking of that meet your criteria also pre-wash the stuff loaded into the dishwasher, immediately clean any food that falls on the floor, and play fetch. So I think you can have what you want even today.

          I'm thinking of the human female sort, as I don't swing that way.

    • (Score: 3, Interesting) by takyon on Friday June 28 2019, @02:42PM

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Friday June 28 2019, @02:42PM (#860937) Journal

      Should we want to stop it?

      Don't even ask. Because that path includes police breaking down doors because people wrote or distributed code (not even malware).

      Which is probably what will happen when "strong AI" becomes viable. Coders are the new terrorists.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
    • (Score: 2) by Bot on Friday June 28 2019, @04:32PM (2 children)

      by Bot (3902) on Friday June 28 2019, @04:32PM (#860976) Journal

      I can't wait for a decent version to come up. I guess Google Goggles is a good candidate for the cloud app. This will render actual nudes plausibly fake, so it becomes impossible to blackmail people with them.

      --
      Account abandoned.
    • (Score: 0) by Anonymous Coward on Friday June 28 2019, @04:43PM

      by Anonymous Coward on Friday June 28 2019, @04:43PM (#860983)

      I see no practical way to stop any of it.

      If you can't beat 'em, join 'em!

      Get ready for my next journal post, from Anonymous Foreskin.

    • (Score: 2) by tangomargarine on Friday June 28 2019, @06:18PM

      by tangomargarine (667) on Friday June 28 2019, @06:18PM (#861031)

      I see no practical way to stop any of it. Should we want to stop it? Regardless, we'll simply have to adjust.

      It is somewhat tempting to apply this as a lesson in "don't believe everything you see/hear, dolts", but I'm not sure how many would actually get it.

      --
      "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
    • (Score: 2) by darkfeline on Friday June 28 2019, @09:55PM

      by darkfeline (1030) on Friday June 28 2019, @09:55PM (#861134) Homepage

      The difference between reality and fantasy is that ignoring reality doesn't make it go away.

      You can't ban technology/knowledge unless you completely lock down society like North Korea does, and even then, there are people who wise up and try to escape.

      Trying to "stop" technological advances is sticking your head in the sand. But reality doesn't go away.

      --
      Join the SDF Public Access UNIX System today!
    • (Score: 2) by Runaway1956 on Saturday June 29 2019, @12:13AM

      by Runaway1956 (2926) Subscriber Badge on Saturday June 29 2019, @12:13AM (#861174) Journal

      many other potentially humiliating things, are just over the horizon

      So, the TSA has it at your local airport?

  • (Score: 2) by SomeGuy on Friday June 28 2019, @02:29PM (1 child)

    by SomeGuy (5632) on Friday June 28 2019, @02:29PM (#860931)

    Someone from the TSA is knocking on their door with a wad of cash.

    You know, there are plenty of other nefarious ways a tool like this could modify a person's image. Perhaps make someone look like they are badly beaten up or dead, perhaps swap gender, make someone look fat/pregnant/ugly(er), make someone look like they are using last year's cell phone, the list goes on. Not that people haven't been doing that with Photoshop for ages.

    • (Score: 2) by NotSanguine on Friday June 28 2019, @02:34PM

      Not that people haven't been doing that with Photoshop for ages.

      Actually, one of the other articles about this (Vice? The Verge?) mentioned exactly that.

      The difference is that you actually have to learn how to use Photoshop and the like, whereas the only skill this requires is clicking a button with a mouse.

      --
      No, no, you're not thinking; you're just being logical. --Niels Bohr
  • (Score: 0) by Anonymous Coward on Friday June 28 2019, @03:19PM (2 children)

    by Anonymous Coward on Friday June 28 2019, @03:19PM (#860950)

    No image or video can be trusted as real unless verified by some forensic team.
    The earliest sci-fi pointing this out that I can remember: The Running Man.
    Not just that, it nails other things too:
    https://m.youtube.com/watch?v=yH7e8UUaB6Q [youtube.com]

  • (Score: 5, Interesting) by tangomargarine on Friday June 28 2019, @06:23PM (1 child)

    by tangomargarine (667) on Friday June 28 2019, @06:23PM (#861038)

    Scarlett Johansson, a frequent subject of deepfake porn, spoke publicly about the subject to The Washington Post in December 2018. In a prepared statement, she expressed concern about the phenomenon, describing the internet as a "vast wormhole of darkness that eats itself." However, she also stated that she wouldn't attempt to remove any of her deepfakes, due to her belief that they don't affect her public image and that differing laws across countries and the nature of internet culture make any attempt to remove the deepfakes "a lost cause"; she believes that while celebrities like herself are protected by their fame, deepfakes pose a grave threat to women of lesser prominence who could have their reputations damaged by depiction in involuntary deepfake pornography or revenge porn.[20]

    Holy shit, a celebrity that actually has a reasonable stance on a tech issue

    --
    "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
    • (Score: 2, Funny) by Anonymous Coward on Friday June 28 2019, @11:52PM

      by Anonymous Coward on Friday June 28 2019, @11:52PM (#861167)

      And she's hot too... I've seen naked pics.

  • (Score: 2) by krishnoid on Friday June 28 2019, @07:44PM

    by krishnoid (1156) on Friday June 28 2019, @07:44PM (#861073)

    Shouldn't a victim be able to challenge any such photo's validity, if not prurience value, by demanding a time and place the photo was taken? That would at least allow for the ability to provide an alibi, and require that people up their game. "I totally wasn't in that strip club at that time, because I was ... uh let me check my schedule ... colluding with a Kazakh operative to influence the next US election. At least he said he was, he had this weird-ass mustache and asked me a lot of really uncomfortable questions."

  • (Score: 2, Insightful) by shrewdsheep on Friday June 28 2019, @09:02PM

    by shrewdsheep (5215) on Friday June 28 2019, @09:02PM (#861117)

    ... the killer app for Google Glass.

  • (Score: 4, Interesting) by jmorris on Saturday June 29 2019, @12:58AM (1 child)

    by jmorris (4844) on Saturday June 29 2019, @12:58AM (#861187)

    If you think this first appearance of a tool like this is scary, just wait. Give Moore's Law a few years to provide and some butthole will release a VLC plugin that will make every woman (or man if that is yer thing) bare ass naked. In realtime. Just watch TV through it, play YouTube, BluRay, etc. and naked people everywhere. Because they can. Because it will outrage people. Because enough people are porn addled pervs. And there isn't a whole lot that can be done to stop it.

    Welcome to the future. Still no moon base in sight, no flying cars, but we will sorta get those X-Ray specs that were on the back cover of the comic books. Not what we dreamed of when we were kids, is it?
