
posted by mrpg on Thursday December 14 2017, @05:00PM   Printer-friendly
from the ohoh dept.

Submitted via IRC for TheMightyBuzzard

Someone used an algorithm to paste the face of 'Wonder Woman' star Gal Gadot onto a porn video, and the implications are terrifying.

There's a video of Gal Gadot having sex with her stepbrother on the internet. But it's not really Gadot's body, and it's barely her own face. It's an approximation, face-swapped to look like she's performing in an existing incest-themed porn video.

[...] Like the Adobe tool that can make people say anything, and the Face2Face algorithm that can swap a recorded video with real-time face tracking, this new type of fake porn shows that we're on the verge of living in a world where it's trivially easy to fabricate believable videos of people doing and saying things they never did. Even having sex.

[...] The ease with which someone could do this is frightening. Aside from the technical challenge, all someone would need is enough images of your face, and many of us are already creating sprawling databases of our own faces: People around the world uploaded 24 billion selfies to Google Photos in 2015-2016.
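The face-swap systems described above typically follow an autoencoder scheme: a shared encoder maps any aligned face crop to a latent pose/expression code, and a per-identity decoder reconstructs that identity's appearance; swapping means encoding person A's frame and decoding with person B's decoder. The toy sketch below illustrates only that data flow — the linear "networks", dimensions, and random weights are invented stand-ins for the deep convolutional models real systems train on thousands of frames.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sketch of the shared-encoder / per-identity-decoder deepfake scheme.
# All sizes and the linear "networks" are illustrative assumptions.
FACE_DIM, LATENT_DIM = 64, 8

encoder = rng.normal(size=(LATENT_DIM, FACE_DIM))    # shared across identities
decoder_a = rng.normal(size=(FACE_DIM, LATENT_DIM))  # "trained" on person A
decoder_b = rng.normal(size=(FACE_DIM, LATENT_DIM))  # "trained" on person B

def face_swap(frame_a):
    """Render person A's pose/expression with person B's appearance."""
    latent = encoder @ frame_a   # identity-agnostic pose/expression code
    return decoder_b @ latent    # reconstructed as person B

frame = rng.normal(size=FACE_DIM)   # stand-in for one aligned face crop
swapped = face_swap(frame)
print(swapped.shape)                # (64,)
```

The point the article makes follows directly from this structure: once the encoder and decoders are trained, producing a swapped frame is a cheap forward pass, so the only scarce input is a large set of photos of the target's face.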

Source: AI-Assisted Fake Porn Is Here and We're All Fucked


Original Submission

 
  • (Score: 3, Insightful) by bradley13 on Thursday December 14 2017, @06:10PM (6 children)

    by bradley13 (3053) on Thursday December 14 2017, @06:10PM (#609788) Homepage Journal

    "Whether an image, or a stream of images in a video, there's no physical harm that takes place when photons hit a person's retina"

    While I don't necessarily disagree with you, this is a hard concept to sell. Why should porno cartoons with Bart Simpson be illegal? Or certain porn-animes? Want to bet that entirely computer-generated kiddie-porn would be illegal? Want to campaign for legalization, on the entirely believable premise that it would provide an outlet for people who otherwise victimize real kids?

    More personally: Want your wife's/girlfriend's head pasted onto a YouPorn video and sent to the world?

    So, I agree with you: no physical harm. But it's far more complicated than that, with no easy answers. What's particularly unfortunate is that the worst opponents you will face, in trying to come to a reasonable solution, are the people who think prohibition is the answer.

    --
    Everyone is somebody else's weirdo.
  • (Score: 2) by takyon on Thursday December 14 2017, @06:39PM

    by takyon (881) <takyonNO@SPAMsoylentnews.org> on Thursday December 14 2017, @06:39PM (#609796) Journal
  • (Score: 5, Insightful) by Grishnakh on Thursday December 14 2017, @07:07PM (2 children)

    by Grishnakh (2831) on Thursday December 14 2017, @07:07PM (#609809)

    More personally: Want your wife's/girlfriend's head pasted onto a YouPorn video and sent to the world?

    I'm going to step in here and opine that this is actually a different issue. It's like slander/libel: it constitutes character defamation. Saying lies about something that doesn't affect anyone ("there's aliens living on Mars!!") is probably mostly harmless, but saying lies about an actual person can damage their reputation and livelihood, cause them to be targeted for harassment, etc. Indeed, we already have laws against using someone's likeness without permission: you can't just photoshop a picture of some famous person doing something out of character and publish it; they'll sue for libel and win. This was the case even before Photoshop, as it's been possible to doctor photos for a long time through manual means. So I don't see how this face-swapping software really changes anything, except making it possible to create fake video, which before wasn't very feasible (e.g. showing video of someone doing something illegal, when in fact it was some other person but their face has been changed by software).

    So I think it's entirely reasonable for someone to be opposed to pasting someone's head onto such a video and posting it publicly (which constitutes defamation/libel and is already illegal), while not being opposed to totally fake cp that doesn't show any real people, because there's no direct victim in the latter case. The counter-argument from the law-enforcement types, however, is probably that they have no reliable way of distinguishing real from fake (though at a guess, I'd imagine that if someone included all the build tools and source data with such a video, showing that no sources involved real people, that could constitute sufficient proof of it being artificially generated).

    • (Score: 0) by Anonymous Coward on Thursday December 14 2017, @08:05PM

      by Anonymous Coward on Thursday December 14 2017, @08:05PM (#609837)

      can we put politician faces on these, then?

      maybe just their names as senders of text messages saying naughty things to women that are intended to present the texts as evidence?

      I mean, it's like kids these days don't even know how to forge an email header anymore
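      The quip above rests on a real property of email: SMTP does not authenticate the From: header, so a message can claim any sender. A minimal sketch, with invented example addresses (and noting that receiving servers today would typically flag such a message via SPF/DKIM/DMARC checks):

```python
from email.message import EmailMessage

# The From: header is just text the author writes; nothing in the message
# format verifies it. Addresses below are hypothetical, for illustration.
msg = EmailMessage()
msg["From"] = "senator.example@legislature.example.gov"   # forged claim
msg["To"] = "reporter@paper.example.com"
msg["Subject"] = "off the record"
msg.set_content("This text proves nothing about who really sent it.")

print(msg["From"])   # the header simply says whatever the author wrote
```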

    • (Score: 2) by GreatAuntAnesthesia on Friday December 15 2017, @01:25AM

      by GreatAuntAnesthesia (3275) on Friday December 15 2017, @01:25AM (#610026) Journal

      I'm going to step in here and opine that this is actually a different issue. It's like slander/libel: it constitutes character defamation.

      More to the point, once it becomes widely known and understood that this is possible, nobody will give a shit. OK, there will be a period of a few years where some people think it's funny and others think it's shocking to paste your co-worker's / boss's / ex-girlfriend's / ex-boyfriend's / stalking victim's face onto a porn star, and I'm sure we can all look forward to the subsequent rash of outraged articles and bullied teenagers and apps getting banned and ill-considered reactionary legislation and all the rest of it. Once the novelty wears off, though, people will just kind of move on. Nobody will pay any more attention to somebody who says "oh hey, look, here's a video of [whoever] doing three guys and a goat" than they'd pay to someone who has cut a celebrity's face out of a lifestyle magazine and glued it into a porn magazine. It will just look kind of sad.

      The existence of this capability might even provide cover for those people whose genuine sex videos *do* end up online. "Oh that? No, that's not me, it's just someone who has edited my face into a porno. Well, yes, that does look like my bedroom; they must have edited that too. Isn't it amazing what they can do with technology now?"

  • (Score: 0) by Anonymous Coward on Thursday December 14 2017, @10:42PM

    by Anonymous Coward on Thursday December 14 2017, @10:42PM (#609934)

    Want to campaign for legalization, on the entirely believable premise that it would provide an outlet for people who otherwise victimize real kids?

    Many people who like such drawn porn aren't even interested in real people at all. It may be the case that a very small number of people would otherwise victimize real kids without such things, but again, it's unlikely that that group is anything but minuscule. Merely having a sexual attraction to a group of people is not an indication that you want to rape them.

  • (Score: 3, Interesting) by JoeMerchant on Thursday December 14 2017, @10:48PM

    by JoeMerchant (3937) on Thursday December 14 2017, @10:48PM (#609941)

    The writers for Marlo Thomas' "That Girl" https://www.google.com/search?q=Marlo+thomas+that+girl [google.com] tackled this thorny issue in the 1960s - anybody's face can be attached to anybody's body doing anything... and published.

    The writers had Marlo shrug it off, seeing the bright side: her Ms. February spread only lasted 28 days. That was almost 50 years ago.

    The technology to forge and publish has been democratized, and somehow society hasn't grown up. Want to publish me and my wife in a video? Fine, go ahead - there's no injury until you claim that the images are a true record of something that actually happened. Once that claim is made, or implied by journalists who reproduce the images, then we've got libel or slander or whatever it's called (I can remember the Spider-Man Jameson scene, I just can't remember which way he said it...).

    What society still needs to learn is the difference between something they see in a recorded image and proof that something actually happened. People making false claims that an image really happened will always be in the wrong. People who are hurt by seeing something are going to have a rough time in the future.

    --
    🌻🌻 [google.com]