
posted by janrinok on Wednesday January 31 2018, @06:23AM   Printer-friendly
from the fake-news-isn't-fake-news-now dept.

The messaging platform Discord has taken down a channel that was being used to share and spread AI-edited pornographic videos:

Last year, a Reddit user known as "deepfakes" used machine learning to digitally edit the faces of celebrities into pornographic videos, and a new app has made it much easier to create and spread such videos online. On Friday, chat service Discord shut down a user-created group that was spreading the videos, citing its policy against revenge porn.

Discord is a free chat platform that caters to gamers and has a poor track record when it comes to dealing with abuse and toxic communities. After being contacted by Business Insider, the company took down the chat group, named "deepfakes."

Discord is a Skype/TeamSpeak/Slack alternative. Here are some /r/deepfakes discussions about the Discord problem.

One take is that there is no recourse for "victims" of AI-generated porn, at least in the U.S.:

People Can Put Your Face on Porn—and the Law Can't Help You

To many vulnerable people on the internet, especially women, this looks a whole lot like the end times. "I share your sense of doom," says Mary Anne Franks, who teaches First Amendment and technology law at the University of Miami Law School and also serves as the tech and legislative policy advisor for the Cyber Civil Rights Initiative. "I think it is going to be that bad."

Merkel Trump Deepfake

Previously: AI-Generated Fake Celebrity Porn Craze "Blowing Up" on Reddit


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 5, Informative) by julian on Wednesday January 31 2018, @05:10PM (1 child)

    by julian (6003) Subscriber Badge on Wednesday January 31 2018, @05:10PM (#631023)

    The Natalie Portman nude photoshoot fake is probably the best one I've seen. It should still be up on the subreddit, but I'm not going to check at work. Even that one isn't perfect, and now that I know what to look for it's easy to spot. They're going to get better; it's still early days, but people will also get better at spotting fakes. There's also the fact that this technique requires an original source video to use as the "body double," so you have to find a video that's already been made with a model as close to the target as possible. That source material isn't always available or close enough to be believable.

    You also need a large library (multiple thousands) of high-quality, diverse face pictures of the person you want to fake. That's easy for celebrities, who get their photo taken all the time, but might be harder to get for normal people.
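
    For context on why those two ingredients matter: the underlying face-swap model is, at its core, an autoencoder with one shared encoder and a separate decoder per identity, which is why it needs thousands of aligned face crops of each person to train. Below is a minimal, illustrative sketch of that idea in PyTorch; the layer sizes, hyperparameters, and names are hypothetical and not taken from any particular app's actual code.

        # Illustrative sketch only: the shared-encoder / per-identity-decoder
        # autoencoder idea behind "deepfakes"-style face swapping. Sizes and
        # hyperparameters are hypothetical; real tools add face alignment,
        # masking, blending, and much larger models.
        import torch
        import torch.nn as nn

        class Encoder(nn.Module):
            def __init__(self):
                super().__init__()
                self.net = nn.Sequential(
                    nn.Conv2d(3, 64, 5, stride=2, padding=2), nn.LeakyReLU(0.1),    # 64x64 -> 32x32
                    nn.Conv2d(64, 128, 5, stride=2, padding=2), nn.LeakyReLU(0.1),  # 32x32 -> 16x16
                    nn.Flatten(),
                    nn.Linear(128 * 16 * 16, 512),  # shared latent code
                )

            def forward(self, x):
                return self.net(x)

        class Decoder(nn.Module):
            def __init__(self):
                super().__init__()
                self.fc = nn.Linear(512, 128 * 16 * 16)
                self.net = nn.Sequential(
                    nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.1),  # 16x16 -> 32x32
                    nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Sigmoid(),         # 32x32 -> 64x64
                )

            def forward(self, z):
                return self.net(self.fc(z).view(-1, 128, 16, 16))

        encoder = Encoder()
        decoder_a = Decoder()  # trained only on faces of person A (the face being pasted in)
        decoder_b = Decoder()  # trained only on faces of person B (the "body double" in the source video)

        opt = torch.optim.Adam(
            list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters()),
            lr=5e-5,
        )
        loss_fn = nn.MSELoss()

        def train_step(faces_a, faces_b):
            """One step: each decoder learns to reconstruct its own person's
            faces from the shared latent space."""
            opt.zero_grad()
            loss = (loss_fn(decoder_a(encoder(faces_a)), faces_a)
                    + loss_fn(decoder_b(encoder(faces_b)), faces_b))
            loss.backward()
            opt.step()
            return loss.item()

        # After training on thousands of aligned 64x64 face crops per person,
        # the "swap" is just routing B's frames through A's decoder:
        #   swapped_face = decoder_a(encoder(face_crop_from_source_video))

    The shared encoder is the trick: because both decoders read from the same latent space, the target's decoder can render the source actor's pose and expression with the target's face, which is exactly why both a close-enough source video and a large face library are required.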

  • (Score: 0) by Anonymous Coward on Thursday February 01 2018, @06:03AM

    by Anonymous Coward on Thursday February 01 2018, @06:03AM (#631357)

    You also need a large library (multiple thousands) of high-quality, diverse face pictures of the person you want to fake. That's easy for celebrities, who get their photo taken all the time, but might be harder to get for normal people.

    Right. Unless they're a social media whore. And that's only roughly 1/7th of the global population.