GitHub is banning copies of 'deepfakes' porn app DeepNude [theverge.com]
GitHub is banning code from DeepNude, the app that used AI to create fake nude pictures of women. Motherboard, which first reported on DeepNude [vice.com] last month, confirmed that [vice.com] the Microsoft-owned software development platform won't allow DeepNude projects. GitHub told Motherboard that the code violated its rules against "sexually obscene content," and it has removed multiple repositories, including one that was officially run by DeepNude's creator.
DeepNude was originally a paid app that created nonconsensual nude pictures of women using technology similar to AI "deepfakes." [theverge.com] The development team shut it down after Motherboard's report, saying that "the probability that people will misuse it is too high." However, as we noted last week [theverge.com], copies of the app were still accessible online — including on GitHub.
Later that week, the DeepNude team followed suit, uploading the core algorithm (but not the actual app interface) to the platform. "The reverse engineering of the app was already on GitHub. It no longer makes sense to hide the source code," wrote the team on a now-deleted page. "DeepNude uses an interesting method to solve a typical AI problem, so it could be useful for researchers and developers working in other fields such as fashion, cinema, and visual effects."
Previously: "Deep Nude" App Removed By Developers After Brouhaha [soylentnews.org]
Related: AI-Generated Fake Celebrity Porn Craze "Blowing Up" on Reddit [soylentnews.org]
Discord Takes Down "Deepfakes" Channel, Citing Policy Against "Revenge Porn" [soylentnews.org]
My Struggle With Deepfakes [soylentnews.org]
Deep Fakes Advance to Only Needing a Single Two Dimensional Photograph [soylentnews.org]