
posted by mrpg on Thursday December 14 2017, @05:00PM
from the ohoh dept.

Submitted via IRC for TheMightyBuzzard

Someone used an algorithm to paste the face of 'Wonder Woman' star Gal Gadot onto a porn video, and the implications are terrifying.

There's a video of Gal Gadot having sex with her stepbrother on the internet. But it's not really Gadot's body, and it's barely her own face. It's an approximation, face-swapped to look like she's performing in an existing incest-themed porn video.

[...] Like the Adobe tool that can make people say anything, and the Face2Face algorithm that can swap a recorded video with real-time face tracking, this new type of fake porn shows that we're on the verge of living in a world where it's trivially easy to fabricate believable videos of people doing and saying things they never did. Even having sex.

[...] The ease with which someone could do this is frightening. Aside from the technical challenge, all someone would need is enough images of your face, and many of us are already creating sprawling databases of our own faces: People around the world uploaded 24 billion selfies to Google Photos in 2015-2016.

Source: AI-Assisted Fake Porn Is Here and We're All Fucked


Original Submission

 
  • (Score: 5, Interesting) by JoeMerchant on Thursday December 14 2017, @05:17PM (31 children)

    by JoeMerchant (3937) on Thursday December 14 2017, @05:17PM (#609764)

    Whether an image, or a stream of images in a video, there's no physical harm that takes place when photons hit a person's retina in any given pattern (subject to intensity limitations, etc.)

    O.K. - stepping back, some people have photo-sensitive epilepsy, and putting out videos that trigger that where they can be seen by the sensitive population is a bad thing. Beyond that, "harmful imagery" is basically a social construct. I'm not talking about images of harm being done, just harm done by the viewing of images.

    A society that constructs itself to be vulnerable to "harmful imagery" is going to have a bad time in the future when anybody can make any images they want, in 3D 8K 120fps with surround sound, indistinguishable from live captured video.

    Veering off onto a related tangent, we're in a (very brief) transitional time here where "forged" videos are detectable; give Moore's law time for another factor of 8 or 16 on the current animation and video processing capabilities, and rendering anything you can imagine will be possible with a few voice commands to your mobile device.

    Makes me think of the recent video of some guys dragging a shark, and similar stupidity that gets posted to the internet... it won't be long at all before such things can be forged, more or less effortlessly. Instead of punishing the people shown doing things in videos, or attempting to track down the forgers, or control the tools that make forgery possible, I think we need to readjust the attitude that images represent truth - because it won't be long before false images are just as easy to make as true ones.

    --
    🌻🌻 [google.com]
  • (Score: 2) by requerdanos on Thursday December 14 2017, @05:49PM (3 children)

    by requerdanos (5997) Subscriber Badge on Thursday December 14 2017, @05:49PM (#609773) Journal

    I think we need to readjust the attitude that images represent truth

    The problem with that idea -- and I don't dispute it -- is that images frequently, in the overwhelming majority of cases, represent truth. A picture, they say, is worth a thousand words. This means that that adjustment will probably be a long time coming.

    because it won't be long before false images are just as easy to make as true ones.

    Well, in terms of still images, that time is already here as far as finding them goes. Just purchase a magazine, and you'll find that it's actually just as easy to get false images as real ones.

    • (Score: 2) by fyngyrz on Friday December 15 2017, @02:46AM (2 children)

      by fyngyrz (6567) on Friday December 15 2017, @02:46AM (#610058) Journal

      The problem with that idea -- and I don't dispute it -- is that images frequently, in the overwhelming majority of cases, represent truth.

      Clearly, society is going to have to get over that idea. It's only been around for a couple centuries anyway.

      Photographs have been subject to fakery almost the entire time anyway... and your presumption of "in the overwhelming majority of cases" has been provably wrong since almost the first day that images went high-color digital. I spent most of my career writing image processing software, and we could make you smile when you weren't way back in 1985, not to mention do decent compositing of things that weren't in the original, change colors, etc. By 1995, just one decade later, we could make your head turn while changing expression in a video, turn you into a frog, etc.

      From TFS:

      The ease with which someone could do this is frightening.

      Mostly to a society that is afraid of nudity and sexuality, and which gullibly swallows every bit of agitprop that comes down the line. Dare we hope this will help our neurotic society get over that? There's an awful lot of superstitious harumphing coming from pulpits and indoctrinated parents... it could take a while. We sure could use a lot less gullibility, too. Imagine if grade school was a place where everyone was taught that they actually needed solid, multiply-sourced evidence to make an assumption!

      • (Score: 2) by requerdanos on Friday December 15 2017, @02:26PM (1 child)

        by requerdanos (5997) Subscriber Badge on Friday December 15 2017, @02:26PM (#610291) Journal

        Photographs have been subject to fakery almost the entire time anyway... and your presumption of "in the overwhelming majority of cases" has been provably wrong since almost the first day that images went high-color digital.

        By images, I don't mean only photographs, nor even principally photographs.

        • (Score: 2) by fyngyrz on Friday December 15 2017, @02:42PM

          by fyngyrz (6567) on Friday December 15 2017, @02:42PM (#610296) Journal

          Movies and videos are just sequences of still images. Altering moving image sequences in ways that are not easily detectable has been done for decades.

  • (Score: 4, Insightful) by Aiwendil on Thursday December 14 2017, @06:04PM

    by Aiwendil (531) on Thursday December 14 2017, @06:04PM (#609783) Journal

    I think we need to readjust the attitude that images represent truth - because it won't be long before false images are just as easy to make as true ones.

    That reminds me of a quote on bash.org [bash.org]:

    <Dugimodo> ever notice miracles stopped happening when the video camera was invented
    <pog> and then they started up again when digital editing came by

  • (Score: 3, Insightful) by bradley13 on Thursday December 14 2017, @06:10PM (6 children)

    by bradley13 (3053) on Thursday December 14 2017, @06:10PM (#609788) Homepage Journal

    "Whether an image, or a stream of images in a video, there's no physical harm that takes place when photons hit a person's retina"

    While I don't necessarily disagree with you, this is a hard concept to sell. Why should porno cartoons with Bart Simpson be illegal? Or certain porn-animes? Want to bet that entirely computer-generated kiddie-porn would be illegal? Want to campaign for legalization, on the entirely believable premise that it would provide an outlet for people who otherwise victimize real kids?

    More personally: Want your wife's/girlfriend's head pasted onto a YouPorn video and sent to the world?

    So, I agree with you: no physical harm. But it's far more complicated than that, with no easy answers. What's particularly unfortunate is that the worst opponents you will face, in trying to come to a reasonable solution, are the people who think that prohibition is the answer.

    --
    Everyone is somebody else's weirdo.
    • (Score: 2) by takyon on Thursday December 14 2017, @06:39PM

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Thursday December 14 2017, @06:39PM (#609796) Journal
    • (Score: 5, Insightful) by Grishnakh on Thursday December 14 2017, @07:07PM (2 children)

      by Grishnakh (2831) on Thursday December 14 2017, @07:07PM (#609809)

      More personally: Want your wife's/girlfriend's head pasted onto a YouPorn video and sent to the world?

      I'm going to step in here and opine that this is actually a different issue. It's like slander/libel: it constitutes character defamation. Saying lies about something that doesn't affect anyone ("there's aliens living on Mars!!") is probably mostly harmless, but saying lies about an actual person can damage their reputation and livelihood, cause them to be targeted for harassment, etc. Indeed, we already have laws against using someone's likeness without permission: you can't just photoshop a picture of some famous person doing something out of character and publish it; they'll sue for libel and win. This was the case even before Photoshop, as it's been possible to doctor photos for a long time through manual means. So I don't see how this face-swapping software really changes anything, except making it possible to create fake video, which before wasn't very feasible (e.g. showing video of someone doing something illegal, when in fact it was some other person but their face has been changed by software).

      So I think it's entirely reasonable for someone to be opposed to pasting someone's head onto such a video and posting it publicly (which constitutes defamation/libel and is illegal now), while not being opposed to totally fake cp that doesn't show any real people, because there's no direct victim in the latter case. The counter-argument, however, from the law enforcement types is probably that they have no reliable way of distinguishing real from fake (though at a guess, I'd imagine that if someone included all the build tools and source data with such a video, to show that no sources involved real people, that could constitute sufficient proof of it being artificially generated).

      • (Score: 0) by Anonymous Coward on Thursday December 14 2017, @08:05PM

        by Anonymous Coward on Thursday December 14 2017, @08:05PM (#609837)

        can we put politician faces on these, then?

        maybe just their names as senders of text messages saying naughty things to women that are intended to present the texts as evidence?

        I mean, it's like the kids these days don't even know how to forge an email header anymore

      • (Score: 2) by GreatAuntAnesthesia on Friday December 15 2017, @01:25AM

        by GreatAuntAnesthesia (3275) on Friday December 15 2017, @01:25AM (#610026) Journal

        I'm going to step in here and opine that this is actually a different issue. It's like slander/libel: it constitutes character defamation.

        More to the point, once it becomes widely known and understood that this is possible, nobody will give a shit. OK, there will be a period of a few years where some people think it's funny and others think it's shocking to post your co-worker's / boss' / ex-girlfriend / ex-boyfriend / stalker victim's face onto a porn star, and I'm sure we can all look forward to the subsequent rash of outraged articles and bullied teenagers and apps getting banned and ill-considered reactionary legislation and all the rest of it. However once the novelty wears off people will just kind of move on. Nobody will pay any attention to anybody who says "oh hey, look, here's a video of [whoever] doing three guys and a goat" any more than they'd pay attention to someone who has cut a celebrity's face out of a lifestyle magazine and glued it into a porn magazine. It will just look kind of sad.

        The existence of this capability might even provide cover for those people whose genuine sex videos *do* end up online. "Oh that? No, that's not me, it's just someone has edited my face into a porno. Well yes that does look like my bedroom, they must have edited that too. Isn't it amazing what they can do with technology now?"

    • (Score: 0) by Anonymous Coward on Thursday December 14 2017, @10:42PM

      by Anonymous Coward on Thursday December 14 2017, @10:42PM (#609934)

      Want to campaign for legalization, on the entirely believable premise that it would provide an outlet for people who otherwise victimize real kids?

      Many people who like such drawn porn aren't even interested in real people at all. It may be the case that a very small number of people would otherwise victimize real kids without such things, but again, it's unlikely that that group is anything but minuscule. Merely having a sexual attraction to a group of people is not an indication that you want to rape them.

    • (Score: 3, Interesting) by JoeMerchant on Thursday December 14 2017, @10:48PM

      by JoeMerchant (3937) on Thursday December 14 2017, @10:48PM (#609941)

      The writers for Marlo Thomas' "That Girl" https://www.google.com/search?q=Marlo+thomas+that+girl [google.com] tackled this thorny issue in the 1960s - anybody's face can be attached to anybody's body doing anything... and published.

      The writers had Marlo shrug it off, seeing the bright side that her Ms. February spread would only last 28 days - and that was almost 50 years ago.

      The technology to forge and publish has been democratized, and somehow society hasn't grown up. Want to publish me and my wife in a video? Fine, go ahead - there's no injury until you try to claim that the images are a true record of something that actually happened. Once that claim is made, or implied by journalists who reproduce the images, then we've got libel or slander or whatever it's called (I can remember the Spider-Man Jameson scene; I just can't remember which way he said it....)

      What society still needs to learn is the difference between something they see in a recorded image and proof that something actually happened. People making false claims that an image really happened will always be in the wrong. People who are hurt by seeing something are going to have a rough time in the future.

      --
      🌻🌻 [google.com]
  • (Score: 2, Disagree) by DeathMonkey on Thursday December 14 2017, @06:45PM (1 child)

    by DeathMonkey (1380) on Thursday December 14 2017, @06:45PM (#609799) Journal

    Whether an image, or a stream of images in a video, there's no physical harm that takes place when photons hit a person's retina in any given pattern

    Libel and slander don't cause physical harm either but they're both illegal for good reasons.

    • (Score: 2) by JoeMerchant on Thursday December 14 2017, @10:50PM

      by JoeMerchant (3937) on Thursday December 14 2017, @10:50PM (#609946)

      It's only illegal if you claim something is true when it isn't.

      Parody is protected.

      --
      🌻🌻 [google.com]
  • (Score: 0) by Anonymous Coward on Thursday December 14 2017, @07:13PM (3 children)

    by Anonymous Coward on Thursday December 14 2017, @07:13PM (#609815)

    "no physical harm that takes place when photons hit a person's retina in any given pattern"

    Not true. You can rewire the brain with images.

    • (Score: 0) by Anonymous Coward on Thursday December 14 2017, @07:15PM

      by Anonymous Coward on Thursday December 14 2017, @07:15PM (#609819)

      And I will. So many brains out there to spoil.

    • (Score: 0) by Anonymous Coward on Thursday December 14 2017, @10:44PM

      by Anonymous Coward on Thursday December 14 2017, @10:44PM (#609937)

      That's the viewer's own problem, unless they are literally forced to view the images.

    • (Score: 2) by JoeMerchant on Thursday December 14 2017, @10:55PM

      by JoeMerchant (3937) on Thursday December 14 2017, @10:55PM (#609952)

      A la Clockwork Orange...

      This mostly works because people have been taught to hate and fear the very idea of certain things.

      While advertising, marketing, etc. all try to reprogram people every day all day long, they have more of an influencing effect than a controlling one.

      What needs to be preserved is the freedom of choice not to look - something that's missing in airport lounges where one is forced to listen to CNN for hours on end.

      --
      🌻🌻 [google.com]
  • (Score: 1) by Gault.Drakkor on Thursday December 14 2017, @07:30PM (2 children)

    by Gault.Drakkor (1079) on Thursday December 14 2017, @07:30PM (#609828)

    As the cost of synthetic creation and manipulation falls...

    What this will probably come to is that content+geo-loc+time must be crypto signed to be considered real. You can't prevent world read+edit. But we should be able to optionally have cryptographically signed media (signed at creation time). I know that there are watermarks and other ways to tie content to a creation device, but those are not under the control of the user. (See the sketch below for what this could look like.)

    Of course this would require people to actually fact check before assuming truth. We know how well that happens, but the easier it is to fact check, the better.
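
    For concreteness, here's a minimal sketch of what such creation-time signing could look like - Python, using the 'cryptography' package. The make_claim/verify_claim helpers, the Ed25519 key choice, and the JSON claim layout are illustrative assumptions, not an existing standard:

      import hashlib
      import json
      from datetime import datetime, timezone

      from cryptography.exceptions import InvalidSignature
      from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

      def make_claim(media, lat, lon, key):
          # Bind the content hash, geolocation, and capture time into one claim,
          # then sign the whole bundle so none of it can be swapped out later.
          claim = {
              "sha256": hashlib.sha256(media).hexdigest(),
              "geo": {"lat": lat, "lon": lon},
              "time": datetime.now(timezone.utc).isoformat(),
          }
          payload = json.dumps(claim, sort_keys=True).encode()
          return {"claim": claim, "sig": key.sign(payload).hex()}

      def verify_claim(media, signed, pub):
          # Reject if the media no longer matches the hash that was signed.
          if hashlib.sha256(media).hexdigest() != signed["claim"]["sha256"]:
              return False
          payload = json.dumps(signed["claim"], sort_keys=True).encode()
          try:
              pub.verify(bytes.fromhex(signed["sig"]), payload)
              return True
          except InvalidSignature:
              return False

      # Sign a frame at capture time, verify it later.
      key = Ed25519PrivateKey.generate()
      frame = b"...raw sensor bytes..."
      signed = make_claim(frame, 51.5, -0.1, key)
      assert verify_claim(frame, signed, key.public_key())
      assert not verify_claim(b"doctored bytes", signed, key.public_key())

    Note that this only proves whoever held the key vouched for those bytes at that moment; as the replies below point out, the key and the capture device still have to be trusted.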

    • (Score: 2) by JoeMerchant on Thursday December 14 2017, @10:57PM

      by JoeMerchant (3937) on Thursday December 14 2017, @10:57PM (#609955)

      must be crypto signed to be considered real

      Nice concept, but any such thing can be forged, and most people don't even come close to understanding the likelihood or ease with which that can happen.

      --
      🌻🌻 [google.com]
    • (Score: 2) by c0lo on Thursday December 14 2017, @11:27PM

      by c0lo (156) Subscriber Badge on Thursday December 14 2017, @11:27PM (#609978) Journal

      But we should be able to optionally have cryptographically signed media (signed at creation time).

      With a backdoored algorithm, right?
      Cause other algos may constitute a crime if exercised by lowly citizens.

      --
      https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
  • (Score: 2) by Nuke on Thursday December 14 2017, @09:01PM (1 child)

    by Nuke (3162) on Thursday December 14 2017, @09:01PM (#609868)

    Whether an image, or a stream of images in a video, there's no physical harm that takes place when photons hit a person's retina in any given pattern... it won't be long at all before such things can be forged, more or less effortlessly. .... I think we need to readjust the attitude that images represent truth - because it won't be long before false images are just as easy to make as true ones.

    Sounds fine and dandy, but there is plenty of harm that can take place that is not "physical", like losing your job or causing a split with your partner.

    What you say about being easy to forge, etc., has been the case with speech and writing ever since they were invented - i.e. the ability to tell lies. The fact that people are familiar with the existence of lies (just as you say they will become familiar with false videos) does not stop them from believing them often. In fact, the ability to falsify "an image, or a stream of images in a video" (as you say) has been around for years; false porn has been used in the past to embarrass or bring down politicians by using look-alike actors, for example, and entire "historical" films used to falsify, sanitise or demonise past events in the public perception.

    • (Score: 3, Insightful) by JoeMerchant on Thursday December 14 2017, @11:01PM

      by JoeMerchant (3937) on Thursday December 14 2017, @11:01PM (#609958)

      there is plenty of harm that can take place that is not "physical", like losing your job or causing a split with your partner.

      This is where society needs to grow up, because these things can and do still happen. The problem is when people believe that a false image is true. The person creating the false image is lying, just the same as speaking false statements - there's really no distinction, except that for the moment only a few people are capable of lying this way. That's going to change soon, to the point where anybody can create a "lying" video as easily as speaking a false story - and that's what society needs to get its head around.

      False images have been made since before the camera, but this recent era of mechanical reproduction has led people to the erroneous assumption that "pictures don't lie." They can, and they will be lying a lot more easily in the future.

      --
      🌻🌻 [google.com]
  • (Score: 2) by acid andy on Thursday December 14 2017, @10:02PM (1 child)

    by acid andy (1683) on Thursday December 14 2017, @10:02PM (#609908) Homepage Journal

    On a purely intellectual level, I agree with you, but I hasten to add that I recognize the practical flaws in that theory.

    A similar line of thinking did help get me through a very shitty job though and can work for other people: provided you're in a job that involves intellectual work rather than physical labor and your co-workers aren't insane enough to physically attack you, everything that happens in the job, no matter how shitty, is simply various benign configurations of photons striking your retina and sound waves vibrating your inner ear, and you might find yourself repeatedly tapping some small rectangles of plastic for a few hours. It's a great way to abstract away stress! Only works up to a point sadly, unless you're a Vulcan!

    --
    If a cat has kittens, does a rat have rittens, a bat bittens and a mat mittens?
    • (Score: 2) by JoeMerchant on Thursday December 14 2017, @11:03PM

      by JoeMerchant (3937) on Thursday December 14 2017, @11:03PM (#609961)

      Yeah, I worked for a crappy boss who was all bluster about firing everybody every so often. It was clearly smoke, but he was also clearly insecure about the financial health of his company - a great sign that it's time to leave, and a real source of stress: if this jerk is that worried, then I may actually be out of a job, not because I performed poorly, but because he did.

      --
      🌻🌻 [google.com]
  • (Score: 4, Insightful) by DannyB on Thursday December 14 2017, @10:54PM (1 child)

    by DannyB (5839) Subscriber Badge on Thursday December 14 2017, @10:54PM (#609950) Journal

    there's no physical harm that takes place when photons hit a person's retina

    I think the harm occurs later, as the images are processed and affect the person's mind.

    As an example: about a decade ago, there was a video going around of some guy brutally mutilating his genitals. There were reaction videos to this on YouTube, showing the person's reaction but not the video they were reacting to. I wondered what the fuss was about. I have to admit I had a bit of a temptation to watch it. But I decided that there are just some things that you cannot un-see, and I would rather not see it nor have it in my memory forever and ever.

    --
    The lower I set my standards the more accomplishments I have.
    • (Score: 2) by JoeMerchant on Thursday December 14 2017, @11:11PM

      by JoeMerchant (3937) on Thursday December 14 2017, @11:11PM (#609964)

      There's a certain happiness when the images and stories of Legendary Figures (such as Santa Claus, the Easter Bunny, Iron Man, etc.) appear real to you, and that's lost when you realize the images are false.

      There's a certain queasiness when watching real images of terrible things taking place - a well-founded queasiness - and the images can be disturbing for a long time. Even though I know it was fake, there was a scene in a gangster movie with a guy's head in a vice... that stuck with me in a bad way, and it would have been even more powerful if it was real.

      The point here is that people need to know that fantasy images, and forgeries, aren't real - and to deal with them as such. After 10 years or so, I now only get revolted by the thought of the vice if I let myself go there.

      If "copycat" acts are an issue, then all of Hollywood and the other movie studios around the world need to be shut down, because they imagine and portray such a vast array of horrible things - giving people ideas... just like writing. Maybe we need to censor the written word too?

      --
      🌻🌻 [google.com]
  • (Score: 2) by Mykl on Thursday December 14 2017, @11:33PM (1 child)

    by Mykl (1112) on Thursday December 14 2017, @11:33PM (#609983)

    there's no physical harm that takes place when photons hit a person's retina in any given pattern

    I can't disagree with this enough. You are effectively saying "nothing you see can ever harm you physically". Victims of PTSD would disagree with you.

    In any case, viewing of certain images absolutely can harm people mentally. That's why we have film rating boards etc.

    • (Score: 2) by JoeMerchant on Friday December 15 2017, @04:25AM

      by JoeMerchant (3937) on Friday December 15 2017, @04:25AM (#610094)

      PTSD is very real, because it comes from a very real experience. Nobody is claiming PTSD from watching "Saving Private Ryan" (without having experienced the war themselves) or from playing video games, but the drone pilots at Nellis are getting it because they are doing real damage on the other side of the world with their video-linked systems. The reality of it matters, and the sooner people clue into the fact that "convincing effects" are not reality, the sooner people can stop suffering trauma from fake images.

      As for film rating boards - the first image that comes to my mind is from "The Aviator," with Howard Hughes defending his footage of Jane Russell's cleavage as being no more prurient than existing approved footage. The film board has to think of the children, or whatever, right? I agree that children should be sheltered from images of sex and violence until they can get some understanding of what they are seeing - but I think parents are far more traumatized than their children when their children see depictions of sex, and apparently the film boards in the USA think that imagery bloodier and more graphic than a real tour in Vietnam is suitable for children 13 and over.

      --
      🌻🌻 [google.com]
  • (Score: 2) by FatPhil on Friday December 15 2017, @09:56AM (1 child)

    by FatPhil (863) <reversethis-{if.fdsa} {ta} {tnelyos-cp}> on Friday December 15 2017, @09:56AM (#610216) Homepage
    > it won't be long before false images are just as easy to make as true ones.

    You've missed the boat on that one by about 100 years: https://en.wikipedia.org/wiki/Cottingley_Fairies
    --
    Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
    • (Score: 2) by JoeMerchant on Friday December 15 2017, @12:55PM

      by JoeMerchant (3937) on Friday December 15 2017, @12:55PM (#610266)

      Well, even today, it's a rare skill to be able to make an undetectable forgery.

      It won't be long before there's an app for that.

      --
      🌻🌻 [google.com]