posted by Fnord666 on Friday January 26 2018, @10:39PM   Printer-friendly
from the porn-driving-innovation dept.

Fake celebrity porn is blowing up on Reddit, thanks to artificial intelligence.

Back in December, the unsavory hobby of a Reddit user by the name of deepfakes became a new centerpiece of the artificial intelligence debate, specifically around the newfound ability to face-swap celebrities and porn stars. Using software, deepfakes was able to take the faces of famous actresses and swap them with those of porn actresses, letting him live out a fantasy of watching famous people have sex. Now, just two months later, easy-to-use applications have sprouted up that can perform this editing with even greater ease, according to Motherboard, which first reported on deepfakes late last year.

Thanks to AI training techniques like machine learning, scores of photographs can be fed into an algorithm that creates convincing human masks to replace the faces of anyone on video, all by using lookalike data and letting the software train itself to improve over time. In this case, users are putting famous actresses into existing adult films. According to deepfakes, this originally required extensive computer science know-how. But Motherboard reports that one user in the burgeoning community of pornographic celebrity face swapping has created a user-friendly app that basically anyone can use.
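The core trick behind these tools is an autoencoder with a shared encoder and one decoder per identity: the encoder learns general face structure from many photos of both people, while each decoder learns to reconstruct one specific face. Swapping means encoding person A's face and decoding it with person B's decoder. Below is a hypothetical, numpy-only linear toy of that training loop and swap step; real deepfake software uses deep convolutional autoencoders trained on thousands of aligned face crops, and all the names and dimensions here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

D, K = 16, 4                              # toy "pixel" dim, latent dim
faces_a = rng.normal(size=(50, D))        # stand-in for person A's face crops
faces_b = rng.normal(size=(50, D)) + 2.0  # person B's crops, offset

W_enc = rng.normal(scale=0.1, size=(D, K))     # shared encoder
W_dec = {"a": rng.normal(scale=0.1, size=(K, D)),   # one decoder per identity
         "b": rng.normal(scale=0.1, size=(K, D))}

def mse(X, who):
    # Mean squared reconstruction error through the shared encoder
    # and the named identity's decoder.
    recon = X @ W_enc @ W_dec[who]
    return float(np.mean((recon - X) ** 2))

loss_before = mse(faces_a, "a")

lr = 0.01
for _ in range(500):
    # Alternate identities so the encoder sees both faces each round.
    for X, who in ((faces_a, "a"), (faces_b, "b")):
        Z = X @ W_enc
        err = Z @ W_dec[who] - X                       # reconstruction residual
        W_dec[who] -= lr * (Z.T @ err) / len(X)        # decoder gradient step
        W_enc -= lr * (X.T @ (err @ W_dec[who].T)) / len(X)  # encoder step

loss_after = mse(faces_a, "a")

# The "swap": A's latent code pushed through B's decoder.
swapped = faces_a @ W_enc @ W_dec["b"]
```

The design point the toy preserves is that the encoder is shared: because it must represent both identities in the same latent space, a code extracted from one face remains meaningful to the other identity's decoder.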

The same technique can be used for non-pornographic purposes, such as inserting Nicolas Cage's face into classic movies. One user also "outperformed" the Princess Leia scene at the end of Disney's Rogue One (you be the judge; the original footage is at the top of the GIF).

The machines are learning.


Original Submission

 
  • (Score: 4, Interesting) by edIII on Saturday January 27 2018, @03:49AM

    by edIII (791) on Saturday January 27 2018, @03:49AM (#628695)

    What's so funny is that his theory has worked from the beginning. Back in the '80s they had the same stuff, except it was far more hilarious: maybe an order of magnitude better than pasting a magazine cutout on the monitor, but not much more. Then the lines started being blurred at the neck at least, and from there it was iterative improvement to the point where we need extensive AI analysis to identify a fraud.

    I honestly wonder if we'll get to the point where we dismiss visual and audio evidence as something akin to hearsay, giving weight only to that which has been verified. Even then, who is to say that the cryptographically signed surveillance video wasn't doctored at its inputs and is passing off a transcoded stream? Security cameras will have to be tamper-proof and audited to be legally valid. Given our hilarious and almost scary lack of security right now, how could we ever say conclusively that the shit wasn't modified? It's going to be degrees of confidence at best.
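    The signed-camera idea above can be sketched as a MAC chain over frames: each frame's authentication tag covers the previous tag, so editing, dropping, or reordering frames breaks the chain. This is a minimal hypothetical sketch (the key name and frame format are made up); note it only proves integrity downstream of the sensor, and, exactly as the comment says, it cannot detect footage that was doctored at the camera's inputs before signing.

    ```python
    import hashlib
    import hmac

    KEY = b"device-secret-key"  # would live in tamper-resistant hardware

    def chain_macs(frames):
        # Tag for frame i covers frame i plus the tag for frame i-1.
        macs, prev = [], b"\x00" * 32
        for frame in frames:
            prev = hmac.new(KEY, prev + frame, hashlib.sha256).digest()
            macs.append(prev)
        return macs

    def verify(frames, macs):
        # Recompute the chain and compare; any edit breaks every later tag.
        return macs == chain_macs(frames)

    frames = [b"frame-%d" % i for i in range(5)]
    macs = chain_macs(frames)
    ok_original = verify(frames, macs)

    tampered = frames.copy()
    tampered[2] = b"doctored"
    ok_tampered = verify(tampered, macs)
    ```

    Chaining (rather than signing each frame independently) is what makes deletion and reordering detectable, not just per-frame substitution.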

    Didn't some researcher crack something about being able to create any protein, with work underway on scaling the process? How much longer until you can replicate DNA evidence to be planted somewhere?

    With plastic surgery getting better day by day, you might be fucking this celebrity and still be wondering if she isn't a fembot quietly stealing your cryptocurrency.

    You just can't trust shit about shit about shit :)

    --
    Technically, lunchtime is at any moment. It's just a wave function.