
SoylentNews is people

posted by Fnord666 on Friday January 26 2018, @10:39PM   Printer-friendly
from the porn-driving-innovation dept.

Fake celebrity porn is blowing up on Reddit, thanks to artificial intelligence.

Back in December, the unsavory hobby of a Reddit user by the name of deepfakes became a new centerpiece of artificial intelligence debate, specifically around the newfound ability to face-swap celebrities and porn stars. Using software, deepfakes was able to take the faces of famous actresses and swap them with those of porn actresses, letting him live out a fantasy of watching famous people have sex. Now, just two months later, easy-to-use applications have sprouted up that perform this editing with even more ease, according to Motherboard, which first reported on deepfakes late last year.

Thanks to AI training techniques like machine learning, scores of photographs can be fed into an algorithm that creates convincing human masks to replace the faces of anyone on video, all by using lookalike data and letting the software train itself to improve over time. In this case, users are putting famous actresses into existing adult films. According to deepfakes, this required some extensive computer science know-how. But Motherboard reports that one user in the burgeoning community of pornographic celebrity face swapping has created a user-friendly app that basically anyone can use.
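The shared-encoder, per-identity-decoder idea behind this kind of face swapping can be sketched in a few lines. The toy version below uses plain linear layers and random stand-in "photos" rather than the deep convolutional networks the actual tools use; all names, sizes, and the training loop are illustrative assumptions, not the real software's code.

```python
# Toy sketch of deepfake-style face swapping: one shared encoder learns a
# common "face" representation, and one decoder per identity reconstructs
# that person's faces. Swapping = encode a photo of A, decode with B's
# decoder. Dimensions and data here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
D, H = 64, 16  # flattened-image size and latent size (toy values)

# Shared encoder weights and two per-identity decoders (plain linear layers).
W_enc = rng.normal(0, 0.1, (H, D))
W_dec = {"A": rng.normal(0, 0.1, (D, H)),
         "B": rng.normal(0, 0.1, (D, H))}

def encode(x):
    return W_enc @ x

def decode(z, who):
    return W_dec[who] @ z

def train_step(x, who, lr=0.01):
    """One gradient step on the reconstruction error ||decode(encode(x)) - x||^2."""
    global W_enc
    z = encode(x)
    err = decode(z, who) - x
    # Gradients of the squared error w.r.t. each linear layer.
    g_dec = np.outer(err, z)
    g_enc = np.outer(W_dec[who].T @ err, x)
    W_dec[who] -= lr * g_dec
    W_enc -= lr * g_enc

# Stand-in "photo collections": noisy variants of one base face per identity.
base = {who: rng.normal(0, 1, D) for who in ("A", "B")}
for _ in range(2000):
    who = rng.choice(["A", "B"])
    photo = base[who] + rng.normal(0, 0.05, D)  # a new "photo" of that person
    train_step(photo, who)

# The swap: encode a photo of A, then decode it with B's decoder.
fake = decode(encode(base["A"]), "B")
```

Because both decoders read from the same latent space, the encoder is pushed toward identity-independent face features, which is what lets a decoder trained only on person B render B's face in A's pose; the real tools apply the same structure with convolutional networks and many thousands of frames.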

The same technique can be used for non-pornographic purposes, such as inserting Nicolas Cage's face into classic movies. One user also "outperformed" the Princess Leia scene at the end of Disney's Rogue One (you be the judge, original footage is at the top of the GIF).

The machines are learning.


Original Submission

  • (Score: 2) by looorg on Saturday January 27 2018, @08:43AM (2 children)

    by looorg (578) on Saturday January 27 2018, @08:43AM (#628769)

    Does it somehow become better when you spank it to fake celeb pictures than to "normal" porn stars? I don't think I really get the celeb fetish. I assume famous people fuck too; the enormous number of "accidental nude selfies and videos" that get leaked seems to indicate that, but I just don't see them being more spank-worthy than anything else.

    I guess the interesting thing here would be if the program is now so easy to use, or the results so indistinguishable from reality, that we can't really trust anything in the digital world anymore. Which might be both scary and liberating.

  • (Score: 2) by takyon on Saturday January 27 2018, @09:56AM

    by takyon (881) <takyonNO@SPAMsoylentnews.org> on Saturday January 27 2018, @09:56AM (#628784) Journal

    Some faces just look better than others. There may be a humiliation/revenge aspect too, such as getting back at actresses and actors who have opinions you don't like. Others may not care about the celebs much but want to elicit any kind of reaction from them. Now that this has hit a few news sites, it's only a matter of time before a celeb gives their [amused|outraged|tearful] take on the fake porn they have been edited into.

    Some have an academic interest: "Can it be done? How well can it be done?" Some want to usher in an age where all audio/video evidence is seen as potentially fake. That kind of undermines the explosion of CCTV, dashcams, etc., especially when so much stuff is hackable (plausible to swap a real video in storage for a fake).

    Some want to see if the U.S. or California will try to ban this practice, setting up a 1st Amendment legal battle.

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 2) by takyon on Saturday January 27 2018, @11:30AM

    by takyon (881) <takyonNO@SPAMsoylentnews.org> on Saturday January 27 2018, @11:30AM (#628803) Journal

    Another thing: while this could be used for revenge porn and humiliation against individuals who aren't famous, it is easier to use celebrities, politicians, etc. because they have put themselves out there in the public sphere so much, generating plenty of high-quality images and videos to be used for training data.

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]