

posted by mrpg on Thursday December 14 2017, @05:00PM
from the ohoh dept.

Submitted via IRC for TheMightyBuzzard

Someone used an algorithm to paste the face of 'Wonder Woman' star Gal Gadot onto a porn video, and the implications are terrifying.

There's a video of Gal Gadot having sex with her stepbrother on the internet. But it's not really Gadot's body, and it's barely her own face. It's an approximation, face-swapped to look like she's performing in an existing incest-themed porn video.

[...] Like the Adobe tool that can make people say anything, and the Face2Face algorithm that can swap a recorded video with real-time face tracking, this new type of fake porn shows that we're on the verge of living in a world where it's trivially easy to fabricate believable videos of people doing and saying things they never did. Even having sex.

[...] The ease with which someone could do this is frightening. Aside from the technical challenge, all someone would need is enough images of your face, and many of us are already creating sprawling databases of our own faces: People around the world uploaded 24 billion selfies to Google Photos in 2015-2016.
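The article doesn't describe the technique in detail, but contemporaneous reporting attributed this kind of face swap to a deep autoencoder with one shared encoder and one decoder per identity: the encoder learns pose and expression features common to both faces, and each decoder learns to render one specific person's face. Below is a minimal sketch of that idea in PyTorch; the layer sizes, latent dimension, and random stand-in "face" tensors are illustrative assumptions, not the actual tool.

    # Illustrative sketch (assumption, not the actual tool): a face-swap
    # autoencoder with one shared encoder and one decoder per identity.
    import torch
    import torch.nn as nn

    LATENT = 128  # assumed latent size

    def make_encoder():
        # Shared encoder: learns pose/expression features common to both faces.
        return nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 64 * 3, 512), nn.ReLU(),
            nn.Linear(512, LATENT),
        )

    def make_decoder():
        # Per-identity decoder: renders one specific person's face.
        return nn.Sequential(
            nn.Linear(LATENT, 512), nn.ReLU(),
            nn.Linear(512, 64 * 64 * 3), nn.Sigmoid(),
        )

    encoder = make_encoder()
    decoder_a = make_decoder()  # identity A (e.g. the performer in the video)
    decoder_b = make_decoder()  # identity B (the face being pasted in)

    params = (list(encoder.parameters()) + list(decoder_a.parameters())
              + list(decoder_b.parameters()))
    opt = torch.optim.Adam(params, lr=1e-4)
    loss_fn = nn.MSELoss()

    # Stand-in data: batches of 64x64 RGB face crops for each identity.
    faces_a = torch.rand(32, 3, 64, 64)
    faces_b = torch.rand(32, 3, 64, 64)

    for step in range(100):
        opt.zero_grad()
        # Each decoder learns to reconstruct its own identity from the
        # shared latent code.
        rec_a = decoder_a(encoder(faces_a)).view_as(faces_a)
        rec_b = decoder_b(encoder(faces_b)).view_as(faces_b)
        loss = loss_fn(rec_a, faces_a) + loss_fn(rec_b, faces_b)
        loss.backward()
        opt.step()

    # The swap: encode frames of A, decode with B's decoder, so B's face
    # appears with A's pose and expression.
    swapped = decoder_b(encoder(faces_a)).view_as(faces_a)

Swapping then amounts to encoding a frame of the source performer and decoding it with the target's decoder, which is why a large collection of photos of the target's face, like the selfie databases mentioned above, is the main ingredient an attacker needs.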

Source: AI-Assisted Fake Porn Is Here and We're All Fucked


Original Submission

 
  • (Score: 2) by Nuke on Thursday December 14 2017, @09:01PM (1 child)


    Whether an image, or a stream of images in a video, there's no physical harm that takes place when photons hit a person's retina in any given pattern... it won't be long at all before such things can be forged, more or less effortlessly. [...] I think we need to readjust the attitude that images represent truth - because it won't be long before false images are just as easy to make as true ones.

    Sounds fine and dandy, but there is plenty of harm that can take place that is not "physical", like losing your job or causing a split with your partner.

    What you say about forgery being easy has been the case with speech and writing ever since they were invented - i.e. the ability to tell lies. The fact that people are familiar with the existence of lies (just as you say they will become familiar with false videos) does not stop them from believing lies often. In fact, the ability to falsify "an image, or a stream of images in a video" (as you say) has been around for years: false porn has been used in the past to embarrass or bring down politicians, for example by using look-alike actors, and entire "historical" films have been used to falsify, sanitise or demonise past events in the public perception.

  • (Score: 3, Insightful) by JoeMerchant on Thursday December 14 2017, @11:01PM


    there is plenty of harm that can take place that is not "physical", like losing your job or causing a split with your partner.

    This is where society needs to grow up, because these things can and do still happen. The problem is when people believe that a false image is true. The person creating the false image is lying, just the same as someone speaking false statements - there's really no distinction, except that for the moment only a few people are capable of lying this way. That's going to change soon, to the point where anybody can create a "lying" video as easily as telling a false story - and that's what society needs to get its head around.

    False images have been made since before the camera, but this recent era of mechanical reproduction has led people to the erroneous assumption that "pictures don't lie." They can, and they will lie a lot more easily in the future.
