Katyanna Quach over at El Reg is reporting on the removal of the DeepNude web and desktop apps [theregister.co.uk] from the developers' website. DeepNude is an application that takes photos of clothed women (apparently, the app does not function properly with photos of males -- there's a shocker!), digitally removes the clothing, and adds realistic-looking naughty bits.
From the article [theregister.co.uk]:
A machine-learning-powered perv super-tool that automagically removed clothes from women in photos to make them appear naked has been torn offline by its makers.
The shamefaced creators of the $50 Windows and Linux desktop app DeepNude claimed they were overwhelmed by demand from internet creeps: the developers' servers apparently buckled under a stampede of downloads, their buggy software generated more crash reports than they could deal with, and this all came amid a firestorm of social media outrage.
[...]
Basement dwellers and trolls could feed it snaps of celebrities, colleagues, ex-girlfriends, and anyone else who takes their fancy, and have the software guess, somewhat badly, what they look like underneath their clothes, keeping their faces intact. These bogus nudes are perfect for distributing around the 'net to humiliate victims.

There was so much interest in this misogynistic piece of crap that the site’s servers couldn’t handle the traffic and crashed, it is claimed. The team initially said on Thursday they were trying to fix bugs, and expected everything to be up and running again in a few days:
Hi! DeepNude is offline. Why? Because we did not expect these visits and our servers need reinforcement. We are a small team. We need to fix some bugs and catch our breath. We are working to make DeepNude stable and working. We will be back online soon in a few days.
— deepnudeapp (@deepnudeapp) June 27, 2019 [twitter.com]

Shortly after that message, they changed their tune. Instead of trying to bring it back online, the developers decided to pull the plug on deepnude.com completely, kill off distribution of the code, and hope the scandal just goes away.
“The world is not yet ready for DeepNude,” the team, based in Estonia, said on Thursday. Or rather, quite likely, the team wasn't ready for all the criticism and rage lobbed its way on Twitter, Facebook, and elsewhere, as a result of its work.
It's unsurprising that an application with this much potential for abuse would cause such outrage. Of course, it's not really gone; it's still available from various torrent sites.
So what say you? Obviously, the genie can't be put back in the bottle, so (as the author of TFA put it) the "Basement dwellers and trolls" will be creating naked pics of, well, everyone for a long time to come.
Of course, DeepFake [wikipedia.org] video can depict your exes and your friends' moms/daughters/grandmothers engaging in hardcore porn, but those techniques aren't (yet) available to the masses. This app, however, can be used by just about anyone *right now*.
What will this do to the quality of still image soft-core porn? Will the courts get involved? How should this be dealt with (if at all)?
Could widespread use of tools like this (and there will be more, of course) finally change how the hoi polloi protect their digital images?
Bonus question: Whose photo(s) will *you* run through this software?
Other coverage:
https://www.theverge.com/2019/6/27/18761496/deepnude-shuts-down-deepfake-nude-ai-app-women [theverge.com]
https://www.vice.com/en_us/article/kzm59x/deepnude-app-creates-fake-nudes-of-any-woman [vice.com]
https://www.vice.com/en_us/article/qv7agw/deepnude-app-that-undresses-photos-of-women-takes-it-offline [vice.com]
https://www.dailydot.com/debug/deepnude-app-pulled-offline/ [dailydot.com]