
SoylentNews is people

posted by martyb on Thursday July 11 2019, @01:53AM   Printer-friendly
from the information-wants-to-be-[clothing]-free dept.

GitHub is banning copies of 'deepfakes' porn app DeepNude

GitHub is banning code from DeepNude, the app that used AI to create fake nude pictures of women. Motherboard, which first reported on DeepNude last month, confirmed that the Microsoft-owned software development platform won't allow DeepNude projects. GitHub told Motherboard that the code violated its rules against "sexually obscene content," and it's removed multiple repositories, including one that was officially run by DeepNude's creator.

DeepNude was originally a paid app that created nonconsensual nude pictures of women using technology similar to AI "deepfakes." The development team shut it down after Motherboard's report, saying that "the probability that people will misuse it is too high." However, as we noted last week, copies of the app were still accessible online — including on GitHub.

Late that week, the DeepNude team followed suit by uploading the core algorithm (but not the actual app interface) to the platform. "The reverse engineering of the app was already on GitHub. It no longer makes sense to hide the source code," wrote the team on a now-deleted page. "DeepNude uses an interesting method to solve a typical AI problem, so it could be useful for researchers and developers working in other fields such as fashion, cinema, and visual effects."

Also at The Register, Vice, and Fossbytes.

Previously: "Deep Nude" App Removed By Developers After Brouhaha

Related: AI-Generated Fake Celebrity Porn Craze "Blowing Up" on Reddit
Discord Takes Down "Deepfakes" Channel, Citing Policy Against "Revenge Porn"
My Struggle With Deepfakes
Deep Fakes Advance to Only Needing a Single Two Dimensional Photograph


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 2) by Knowledge Troll on Thursday July 11 2019, @05:32AM (4 children)

    by Knowledge Troll (5948) on Thursday July 11 2019, @05:32AM (#865694) Homepage Journal

    This isn't some generic image manipulation application.

    I've been saying that this is special purpose software made for the exact use case of putting a head you want to see on another body you want to see. That part is not lost on me.

    nor is it a tool whose primary applications are likely to be nefarious.

    I think the primary use case would be watching the porn yourself. Only a very small fraction of people will ever share that. This isn't very different from cutting pictures out of Playboys to match up with heads from other photographs. It's not a thing I do, but if someone does that in their home I don't care. I also don't care if someone watches a DeepFake in their own house. That doesn't harm anyone except possibly the individual themselves, depending on how you feel about pornography. The software has been convicted of being obscene itself because some people might do some obscene stuff with it.

    This isn't some generic image manipulation application.

    The very special use case software here has an ML engine with usages beyond tit replacement. We are losing value here - hopefully someone will make use of it from an archive somewhere else if that value is high.

    Is there no such application that you'd find offensive enough that a developer should "lose sleep over it"?

    I can't think of one, no. I'm sure one exists, but when I see stuff I don't agree with I usually just move my eyes away from it.

    If, instead, someone created an application that is SOLELY designed to dress a person up in a Nazi uniform and make it appear as if they were in old film footage in the concentration camps in WWII beating and killing Jews, I think we might rightly ask what the application is for and whether it is something we want to support.

    Even that doesn't bug me. I wouldn't be a part of making it, because that's not my style and I find no redeeming qualities in it at all. Would I ban it from something I hosted? I'm not sure. Probably not of my own volition. Depending on how much I like money, I might cave to financial pressure like GitHub did if I needed to. I would call that financial interests corrupting open source, though.

    I also thought removing weboob from Debian was an epic waste of energy on Debian's part. The people advocating for its removal acted like the software itself was some kind of burden on the Debian project, and they cited needing to prep it for release as a waste of time because the software is gross and just doesn't belong. That's a false concern, though: packages are maintained by volunteer package maintainers, and no one can make a Debian dev do anything. If a package has open bugs before the release can be made, the package is not included. It cost the Debian project more energy to discuss removing it than it would have cost to just leave it alone.

    I don't really understand this recent censorship push in open source.

  • (Score: 0) by Anonymous Coward on Thursday July 11 2019, @03:57PM (3 children)

    by Anonymous Coward on Thursday July 11 2019, @03:57PM (#865836)

    It is basically the same reason that bigoted conservatives don't like being called Nazis, or even just alt-right. This weird stance you have on open source is quite similar to that of the free speech zealots. Nuance and context don't matter to such people, and thus reality appears very strange to them.

    • (Score: 2) by Knowledge Troll on Thursday July 11 2019, @05:15PM (2 children)

      by Knowledge Troll (5948) on Thursday July 11 2019, @05:15PM (#865861) Homepage Journal

      It is basically the same reason that bigoted conservatives don't like being called Nazis or even just alt-right.

      If your definition of Nazi requires only that a person be bigoted, then maybe what they dislike is the utter abuse of the word, not the label being applied to them.

      • (Score: 0) by Anonymous Coward on Friday July 12 2019, @04:39PM (1 child)

        by Anonymous Coward on Friday July 12 2019, @04:39PM (#866301)

        Look at you, white knight of the neo-nazis. Milo is that you?

        • (Score: 0) by Anonymous Coward on Saturday July 13 2019, @07:59AM

          by Anonymous Coward on Saturday July 13 2019, @07:59AM (#866525)

          WW2 is over and She Lost.