AI-Generated Fake Celebrity Porn Craze "Blowing Up" on Reddit

posted by Fnord666 on Friday January 26 2018, @10:39PM
from the porn-driving-innovation dept.

Fake celebrity porn is blowing up on Reddit, thanks to artificial intelligence.

Back in December, the unsavory hobby of a Reddit user by the name of deepfakes became a new centerpiece of the artificial intelligence debate, specifically around the newfound ability to face-swap celebrities and porn stars. Using software, deepfakes was able to take the faces of famous actresses and swap them with those of porn actresses, letting him live out a fantasy of watching famous people have sex. Now, just two months later, easy-to-use applications have sprouted up that make this editing even easier, according to Motherboard, which first reported on deepfakes late last year.

Thanks to machine learning techniques, scores of photographs can be fed into an algorithm that creates convincing human masks to replace the faces of anyone on video, using lookalike data and letting the software train itself to improve over time. In this case, users are putting famous actresses into existing adult films. According to deepfakes, this originally required extensive computer science know-how. But Motherboard reports that one user in the burgeoning community of pornographic celebrity face-swapping has created a user-friendly app that basically anyone can use.

The same technique can be used for non-pornographic purposes, such as inserting Nicolas Cage's face into classic movies. One user also "outperformed" the Princess Leia scene at the end of Disney's Rogue One (you be the judge, original footage is at the top of the GIF).

The machines are learning.


Original Submission

Related Stories

Discord Takes Down "Deepfakes" Channel, Citing Policy Against "Revenge Porn" 27 comments

The messaging platform Discord has taken down a channel that was being used to share and spread AI-edited pornographic videos:

Last year, a Reddit user known as "deepfakes" used machine learning to digitally edit the faces of celebrities into pornographic videos, and a new app has made it much easier to create and spread the videos online. On Friday, chat service Discord shut down a user-created group that was spreading the videos, citing its policy against revenge porn.

Discord is a free chat platform that caters to gamers, and has a poor track record when it comes to dealing with abuse and toxic communities. After it was contacted by Business Insider, the company took down the chat group, named "deepfakes."

Discord is a Skype/TeamSpeak/Slack alternative. Here are some /r/deepfakes discussions about the Discord problem.

One take is that there is no recourse for "victims" of AI-generated porn, at least in the U.S.:

People Can Put Your Face on Porn—and the Law Can't Help You

To many vulnerable people on the internet, especially women, this looks a whole lot like the end times. "I share your sense of doom," said Mary Anne Franks, who teaches First Amendment and technology law at the University of Miami Law School and also serves as the tech and legislative policy advisor for the Cyber Civil Rights Initiative. "I think it is going to be that bad."

Merkel Trump Deepfake

Previously: AI-Generated Fake Celebrity Porn Craze "Blowing Up" on Reddit


Original Submission

Pornhub and Reddit Purge AI-Generated "Involuntary Pornography" (Updated) 50 comments

The AI porn purge continues:

Pornhub will be deleting "deepfakes" — AI-generated videos that realistically edit new faces onto pornographic actors — under its rules against nonconsensual porn, following in the footsteps of platforms like Discord and Gfycat. "We do not tolerate any nonconsensual content on the site and we remove all said content as soon as we are made aware of it," the company told Motherboard, which first reported on the deepfakes porn phenomenon last year. Pornhub says that nonconsensual content includes "revenge porn, deepfakes, or anything published without a person's consent or permission."

Update: The infamous subreddit itself, /r/deepfakes, has been banned by Reddit. /r/CelebFakes and /r/CelebrityFakes have also been banned for their non-AI porn fakery (they had existed for over 7 years). Other subreddits like /r/fakeapp (technical support for the software) and /r/SFWdeepfakes remain intact. Reported at Motherboard, The Verge, and TechCrunch.

Motherboard also reported on some users (primarily on a new subreddit, /r/deepfakeservice) offering to accept commissions to create deepfakes porn. This is seen as more likely to result in a lawsuit.

My Struggle With Deepfakes 14 comments

There has been some controversy over Deepfakes, a process of substituting faces in video. Almost immediately, it was used for pornography. While celebrities were generally unamused, porn stars were alarmed by the further commodification of their rôle. The algorithm is widely available and several web sites removed objectionable examples. You know something is controversial when porn sites remove it. Reddit was central for Deepfakes/FakeApp tech support and took drastic action to remove discussion after it started to become synonymous with fictitious revenge porn and other variants of anti-social practices.

I found a good description of the deepfakes algorithm. It runs on a standard neural network library but requires considerable processing power on specific GPUs. I will call the video input (with the face to be removed) the source, and the replacement face the target. The neural network is trained with the target face only: training images of the target are randomly distorted, and the network learns to restore them to the undistorted reference images. When the trained network is then given the source, it treats it as just another distorted input and "undistorts" the source into the target.
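A minimal sketch of that scheme in Python with Keras (the original tool ran on TensorFlow) may make it concrete. The crop size, layer sizes, and the warp() helper below are illustrative assumptions, not the actual FakeApp code:

    import numpy as np
    from tensorflow import keras
    from tensorflow.keras import layers

    def build_autoencoder(size=64):
        # Encoder: compress an aligned face crop down to a small code.
        inp = keras.Input(shape=(size, size, 3))
        x = layers.Conv2D(128, 5, strides=2, padding="same", activation="relu")(inp)
        x = layers.Conv2D(256, 5, strides=2, padding="same", activation="relu")(x)
        code = layers.Dense(512, activation="relu")(layers.Flatten()(x))
        # Decoder: reconstruct the clean face from the code.
        y = layers.Dense((size // 4) ** 2 * 256, activation="relu")(code)
        y = layers.Reshape((size // 4, size // 4, 256))(y)
        y = layers.Conv2DTranspose(128, 5, strides=2, padding="same", activation="relu")(y)
        out = layers.Conv2DTranspose(3, 5, strides=2, padding="same", activation="sigmoid")(y)
        model = keras.Model(inp, out)
        model.compile(optimizer="adam", loss="mae")
        return model

    def warp(faces):
        # Stand-in for the random distortion step; real implementations
        # use random warps and crops rather than plain noise.
        return np.clip(faces + np.random.normal(0.0, 0.05, faces.shape), 0.0, 1.0)

    model = build_autoencoder()
    # target_faces: aligned 64x64 crops of the target identity (assumed given).
    # Training never sees the source identity at all:
    # model.fit(warp(target_faces), target_faces, epochs=..., batch_size=...)
    # Inference: aligned crops from the source video go in, and the network
    # "undistorts" them toward the target identity:
    # swapped = model.predict(source_face_crops)

The swapped crops then have to be blended back into the original frames, which is where much of the manual fiddling reported by users seems to come in.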

[Continues...]

Deep Fakes Advance to Only Needing a Single Two Dimensional Photograph 4 comments

Currently, getting a realistic deep fake requires shots from multiple angles. Russian researchers have now taken this a step further, generating realistic video sequences based on a single photo.

Researchers trained the algorithm to understand facial features' general shapes and how they behave relative to each other, and then to apply that information to still images. The result was a realistic video sequence of new facial expressions from a single frame.

As a demonstration, they provide details and synthesized video sequences of historical figures such as Albert Einstein and Salvador Dali, as well as sequences based on paintings such as the Mona Lisa.
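In rough outline, the method pairs an embedder, which distills an identity vector from one or a few stills, with a generator that renders that identity in poses supplied as rasterized facial landmarks. A structural sketch follows; the layer sizes and module internals are invented for illustration and are not the authors' code:

    import tensorflow as tf
    from tensorflow import keras
    from tensorflow.keras import layers

    def build_embedder(dim=256):
        # One reference photo -> an identity vector.
        still = keras.Input(shape=(64, 64, 3))
        x = layers.Conv2D(64, 4, strides=2, padding="same", activation="relu")(still)
        x = layers.Conv2D(128, 4, strides=2, padding="same", activation="relu")(x)
        x = layers.GlobalAveragePooling2D()(x)
        return keras.Model(still, layers.Dense(dim)(x))

    def build_generator(dim=256):
        # Landmark sketch + identity vector -> a synthesized face frame.
        landmarks = keras.Input(shape=(64, 64, 3))
        identity = keras.Input(shape=(dim,))
        h = layers.Conv2D(64, 4, strides=2, padding="same", activation="relu")(landmarks)
        style = layers.Reshape((1, 1, 64))(layers.Dense(64)(identity))
        h = h + style  # inject the identity into the decoder features
        out = layers.Conv2DTranspose(3, 4, strides=2, padding="same",
                                     activation="sigmoid")(h)
        return keras.Model([landmarks, identity], out)

    embedder, generator = build_embedder(), build_generator()

    # A single still (say, a painting) is embedded once...
    photo = tf.random.uniform((1, 64, 64, 3))
    identity = embedder(photo)
    # ...then one landmark frame goes in, one synthesized face frame comes out.
    landmark_frame = tf.random.uniform((1, 64, 64, 3))
    frame = generator([landmark_frame, identity])  # shape (1, 64, 64, 3)

With more than one still, the per-photo embeddings would simply be combined (e.g. averaged), which is presumably where the accuracy and "identity preservation" gains from multiple frames come in.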

The authors are aware of the potential downsides of their technology and address this:

We realize that our technology can have a negative use for the so-called "deepfake" videos. However, it is important to realize, that Hollywood has been making fake videos (aka "special effects") for a century, and deep networks with similar capabilities have been available for the past several years (see links in the paper). Our work (and quite a few parallel works) will lead to the democratization of the certain special effects technologies. And the democratization of the technologies has always had negative effects. Democratizing sound editing tools lead to the rise of pranksters and fake audios, democratizing video recording lead to the appearance of footage taken without consent. In each of the past cases, the net effect of democratization on the World has been positive, and mechanisms for stemming the negative effects have been developed. We believe that the case of neural avatar technology will be no different. Our belief is supported by the ongoing development of tools for fake video detection and face spoof detection alongside with the ongoing shift for privacy and data security in major IT companies.

While it works with as few as one frame to learn from, the technology benefits in accuracy and 'identity preservation' from having multiple frames available. This becomes obvious when observing the synthesized Mona Lisa sequences, which, while accurate to the original, appear to be essentially three different individuals to the human eye watching them.

Journal Reference: https://arxiv.org/abs/1905.08233v1

Related Coverage
Most Deepfake Videos Have One Glaring Flaw: A Lack of Blinking
My Struggle With Deepfakes
Discord Takes Down "Deepfakes" Channel, Citing Policy Against "Revenge Porn"
AI-Generated Fake Celebrity Porn Craze "Blowing Up" on Reddit
As Fake Videos Become More Realistic, Seeing Shouldn't Always be Believing


Original Submission

GitHub Censors "Sexually Obscene" DeepNude Code 102 comments

GitHub is banning copies of 'deepfakes' porn app DeepNude

GitHub is banning code from DeepNude, the app that used AI to create fake nude pictures of women. Motherboard, which first reported on DeepNude last month, confirmed that the Microsoft-owned software development platform won't allow DeepNude projects. GitHub told Motherboard that the code violated its rules against "sexually obscene content," and it's removed multiple repositories, including one that was officially run by DeepNude's creator.

DeepNude was originally a paid app that created nonconsensual nude pictures of women using technology similar to AI "deepfakes." The development team shut it down after Motherboard's report, saying that "the probability that people will misuse it is too high." However, as we noted last week, copies of the app were still accessible online — including on GitHub.

Late that week, the DeepNude team followed suit by uploading the core algorithm (but not the actual app interface) to the platform. "The reverse engineering of the app was already on GitHub. It no longer makes sense to hide the source code," wrote the team on a now-deleted page. "DeepNude uses an interesting method to solve a typical AI problem, so it could be useful for researchers and developers working in other fields such as fashion, cinema, and visual effects."

Also at The Register, Vice, and Fossbytes.

Previously: "Deep Nude" App Removed By Developers After Brouhaha

Related: AI-Generated Fake Celebrity Porn Craze "Blowing Up" on Reddit
Discord Takes Down "Deepfakes" Channel, Citing Policy Against "Revenge Porn"
My Struggle With Deepfakes
Deep Fakes Advance to Only Needing a Single Two Dimensional Photograph


Original Submission

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 4, Interesting) by takyon on Friday January 26 2018, @10:39PM

    by takyon (881) <reversethis-{gro ... s} {ta} {noykat}> on Friday January 26 2018, @10:39PM (#628597) Journal

    This is the NSFW subreddit: /r/deepfakes [reddit.com]

    FakeApp [reddit.com] only works on Windows for now, with NVIDIA GPUs (uses Tensorflow), and most are using GTX 980, 1070 or better and no less than 4 GB of VRAM. It seems it can gobble up 16 GB of RAM easily, but it depends on the size of the training data and other factors.

    The coder [reddit.com] is looking at efficiency improvements to help users with only 2 GB of VRAM, and will move the code to a repository after it is cleaned up and some remaining features are added.

    Trump in place of Chris Hemsworth in a Thor Ragnarok clip. [youtube.com]

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]

  • (Score: 0) by Anonymous Coward on Friday January 26 2018, @10:42PM (6 children)

    by Anonymous Coward on Friday January 26 2018, @10:42PM (#628599)

    This post on reddit is blowing up: Fake celeb [reddit.com] *somewhat SFW*

    They were too busy wondering if they could that they never stopped to think about whether they should.

    • (Score: 2) by bob_super on Saturday January 27 2018, @12:42AM (2 children)

      by bob_super (1357) on Saturday January 27 2018, @12:42AM (#628660)

      DIY_Celeb_Porn: Like we really needed another reason for people to snatch GPUs and make prices insane...

      • (Score: 0) by Anonymous Coward on Saturday January 27 2018, @01:30AM

        by Anonymous Coward on Saturday January 27 2018, @01:30AM (#628669)

        Yeah, that video probably required at least a dozen geforce 1080s!

      • (Score: 0) by Anonymous Coward on Saturday January 27 2018, @04:00PM

        by Anonymous Coward on Saturday January 27 2018, @04:00PM (#628910)

        or get them licensed....

    • (Score: 2) by FatPhil on Saturday January 27 2018, @04:22AM (2 children)

      by FatPhil (863) <pc-soylentNO@SPAMasdf.fi> on Saturday January 27 2018, @04:22AM (#628710) Homepage
      But lookalike porn has existed for ages. This is no different. The only thing this tells me is *stop trusting video evidence from untrusted sources*.

      But try telling that to a population that believed Saddam had access to weapons of mass destruction.
      --
      Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
  • (Score: 2) by Runaway1956 on Friday January 26 2018, @10:59PM (5 children)

    by Runaway1956 (2926) Subscriber Badge on Friday January 26 2018, @10:59PM (#628605) Journal

    A young man put the face of Lee Harvey Oswald onto his own photo. His own grandmother swears that it's her grandson's photograph. His parents know better, so they won't swear to it, but he says that he has fooled them in the past.

    I can't say how easy it might be to make convincing fakes, but obviously, it's not terribly difficult.

  • (Score: 1, Interesting) by Anonymous Coward on Friday January 26 2018, @11:22PM (3 children)

    by Anonymous Coward on Friday January 26 2018, @11:22PM (#628615)

    Once again, porn leads the tech bleeding edge. :P

    What can't porn do? Someone organize pay-per-view deathmatch between porn and global warming.

    • (Score: 2) by MichaelDavidCrawford on Friday January 26 2018, @11:41PM (2 children)

      by MichaelDavidCrawford (2339) Subscriber Badge <mdcrawford@gmail.com> on Friday January 26 2018, @11:41PM (#628626) Homepage Journal

      Shortly after Blu-ray's introduction to the market, someone in the Blu-ray business met with a bunch of pr0n industry technical people to explain how Blu-ray should be used by them

      I Am Absolutely Serious.

      --
      Yes I Have No Bananas. [gofundme.com]
      • (Score: 2) by krishnoid on Friday January 26 2018, @11:54PM

        by krishnoid (1156) on Friday January 26 2018, @11:54PM (#628637)

        Makes sense -- seems like they tapped into the wisdom of the Beta/VHS parable.

      • (Score: 3, Interesting) by bob_super on Saturday January 27 2018, @12:11AM

        by bob_super (1357) on Saturday January 27 2018, @12:11AM (#628646)

        Shortly after Blu-ray's introduction to the market, someone in the Blu-ray business met with a bunch of pr0n industry technical people to explain how Blu-ray should be used by them

        I Am Absolutely Serious.

        If you don't realize that the porn industry was following Blu-Ray (and HDDVD) from the first draft of the advance specifications, you're even more off your meds than usual.
        DVDs have a "camera angle" feature, which wasn't requested by Spielberg ...

  • (Score: 2) by bob_super on Friday January 26 2018, @11:23PM (2 children)

    by bob_super (1357) on Friday January 26 2018, @11:23PM (#628619)

    I totally banged Jessica Alba and Natalie Portman !
    Jealous people are always trying to find ways to dismiss my evidence.

  • (Score: 4, Informative) by buswolley on Friday January 26 2018, @11:27PM (9 children)

    by buswolley (848) on Friday January 26 2018, @11:27PM (#628620)

    They're not hard to detect, although some are more convincing than others. Check the Leia remake from the Force Awakens. It competes with the special effects crew's attempt pretty well.

    However, in the future, this is going to be a big problem. Video evidence is now weak, if there is ever a hint of conspiracy against someone. Politicians will use it. Media desperate for a story may use it. Russia will use it.

    --
    subicular junctures
    • (Score: 2) by MichaelDavidCrawford on Friday January 26 2018, @11:45PM (5 children)

      by MichaelDavidCrawford (2339) Subscriber Badge <mdcrawford@gmail.com> on Friday January 26 2018, @11:45PM (#628628) Homepage Journal

      I swear I'm not making this up:

      I don't remember much about the episode but the 5-0 people had a photo whose authenticity was called into question.

      They consulted an image processing expert who advised them to enlarge the photo enough that the pixels could be seen clearly.

      If the photo were shopped there would be a discontinuity at the edges of the bits that were pasted in.

      Now of course it won't be so easy to spot today's fakes, because one can blend adjacent edges together to make it less obvious there is a seam. But I expect the principle still applies.
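
      A toy version of that principle in Python/numpy (the image, the paste, and the numbers are all made up for illustration): paste a patch into a smooth picture, and the seam shows up as a line of unusually strong gradients. Real forensics is subtler, but this is the discontinuity the expert was talking about.

        import numpy as np

        # Smooth synthetic "photo": a gentle horizontal brightness ramp.
        image = np.tile(np.linspace(0.0, 0.7, 64), (64, 1))
        # The pasted-in bit: a block brighter than its surroundings.
        image[24:40, 24:40] = 0.9

        # Gradient magnitude via finite differences.
        gy, gx = np.gradient(image)
        edges = np.hypot(gx, gy)

        seam = edges[24, 26:38].mean()       # sampled along the top edge of the paste
        elsewhere = edges[10, 26:38].mean()  # same columns, untouched area
        print(f"seam={seam:.3f} elsewhere={elsewhere:.3f}")  # seam is far stronger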

      --
      Yes I Have No Bananas. [gofundme.com]
      • (Score: 5, Insightful) by takyon on Friday January 26 2018, @11:56PM (3 children)

        by takyon (881) <reversethis-{gro ... s} {ta} {noykat}> on Friday January 26 2018, @11:56PM (#628639) Journal

        But I expect

        I expect that the machine learning techniques will become better with more revisions, greater access to curated training data, and better hardware. Eventually, the fakes could be made in near real-time.

        In real life, you don't need to convince experts with your video, you just need to convince Facebook users who unironically share or get enraged by fake news stories.

        --
        [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
        • (Score: 1, Insightful) by Anonymous Coward on Saturday January 27 2018, @01:32AM

          by Anonymous Coward on Saturday January 27 2018, @01:32AM (#628671)

          Don't forget the confirmation bias that will occur when "they" say it is fake which will only make people think it is a cover up.

        • (Score: 4, Interesting) by edIII on Saturday January 27 2018, @03:49AM

          by edIII (791) on Saturday January 27 2018, @03:49AM (#628695)

          What's so funny is that his theory has worked from the beginning. Back in the 80's they had the same stuff, except it was, like, far more hilarious. Maybe an order of magnitude better than pasting a magazine cutout on the monitor, but not much better. Then the lines started being blurred on the neck at least, and from there it was iterative improvements to the point where we need extensive AI analysis to identify a fraud.

          I honestly wonder if we will get to the point where we dismiss visual and audio evidence as something akin to hearsay, while giving weight only to that which has been verified. Even then, who is to say that the cryptographically signed surveillance video wasn't doctored at its inputs and is passing a transcoded stream? Security cameras will have to be tamper-proof and audited to be legally valid. Given our hilarious and almost scary lack of security right now, how could we ever say conclusively that the shit wasn't modified? It's going to be degrees of confidence at best.

          Didn't some researcher crack something about being able to create any protein, and they're working on a way of scaling the process? How much longer till you can replicate DNA and evidence to be placed somewhere?

          With plastic surgery getting better day by day, you might be fucking this celebrity and still be wondering if she isn't a fembot quietly stealing your cryptocurrency.

          You just can't trust shit about shit about shit :)

          --
          Technically, lunchtime is at any moment. It's just a wave function.
        • (Score: 0) by Anonymous Coward on Saturday January 27 2018, @12:50PM

          by Anonymous Coward on Saturday January 27 2018, @12:50PM (#628836)

          And BOOM orange clown is the president of the United States. Putin will be laughing all the way to the bank.

      • (Score: 4, Insightful) by Anonymous Coward on Saturday January 27 2018, @01:04AM

        by Anonymous Coward on Saturday January 27 2018, @01:04AM (#628666)

        If there's an algorithm/method to detect a fake, there's input to a training algorithm to ensure it doesn't trigger that algorithm.
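
        A toy illustration of that point, assuming the detector is itself a differentiable network (both models below are hypothetical stand-ins, not any real tool): freeze the detector, add its "fake" score to the forger's training loss, and gradient descent learns to slip past it.

          import tensorflow as tf
          from tensorflow import keras
          from tensorflow.keras import layers

          # Stand-in "published" fake detector: image -> P(fake).
          detector = keras.Sequential([
              keras.Input(shape=(64, 64, 3)),
              layers.Conv2D(16, 3, activation="relu"),
              layers.GlobalAveragePooling2D(),
              layers.Dense(1, activation="sigmoid"),
          ])
          detector.trainable = False  # the forger can't change it, only evade it

          # Stand-in generator producing the fakes.
          generator = keras.Sequential([
              keras.Input(shape=(64, 64, 3)),
              layers.Conv2D(16, 3, padding="same", activation="relu"),
              layers.Conv2D(3, 3, padding="same", activation="sigmoid"),
          ])

          opt = keras.optimizers.Adam(1e-4)
          frames = tf.random.uniform((8, 64, 64, 3))  # dummy source frames

          for step in range(100):
              with tf.GradientTape() as tape:
                  fakes = generator(frames, training=True)
                  # Reconstruction keeps the output plausible; the detector
                  # term explicitly drives the "this is fake" score down.
                  loss = (tf.reduce_mean(tf.abs(fakes - frames))
                          + tf.reduce_mean(detector(fakes)))
              grads = tape.gradient(loss, generator.trainable_variables)
              opt.apply_gradients(zip(grads, generator.trainable_variables))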

    • (Score: 5, Interesting) by takyon on Friday January 26 2018, @11:48PM

      by takyon (881) <reversethis-{gro ... s} {ta} {noykat}> on Friday January 26 2018, @11:48PM (#628631) Journal

      GPU hardware is improving quite well, custom machine learning hardware such as TPUs could improve on that further, and there's talk of quantum machine learning or neuromorphic computing. 6-8 GB of VRAM is not uncommon, and memory bandwidth could increase a lot with GDDR6 and HBM. The quantity of RAM that home users can afford will probably double soon. We may have unified storage and memory in the future (current attempts like XPoint are weak).

      The users on the subreddit report having to match up angles, sizes, and such in the training data to get good results. Improvements to the software could reduce the amount of manual fiddling needed. Machine learning could be used on the selection of the training data itself.

      In the near term, instead of matching existing videos with the desired faces, you could film using a (small-time) actor with some similarity to the replacement face/body. You can do everything you need to do to ensure good training data and a convincing scene, and create your "Assange dies escaping the Ecuadorian embassy" video with controlled conditions.

      https://www.theverge.com/2017/7/12/15957844/ai-fake-video-audio-speech-obama [theverge.com]
      https://spectrum.ieee.org/tech-talk/robotics/artificial-intelligence/ai-creates-fake-obama [ieee.org]

      Don't forget the audio:

      https://www.theverge.com/2017/4/24/15406882/ai-voice-synthesis-copy-human-speech-lyrebird [theverge.com]

      Vocaloid has been around for years, but now there's intense research efforts by Amazon, Google, Facebook, Baidu, Apple, Samsung, etc. as they are all pitching voice assistants and want to make them sound more realistic.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
    • (Score: 2) by wonkey_monkey on Saturday January 27 2018, @05:58PM

      by wonkey_monkey (279) on Saturday January 27 2018, @05:58PM (#628998) Homepage

      Check the Leia remake from the Force Awakens. It competes with the special effects crew's attempt pretty well.

      Eh, barely. That video is horribly low resolution, and it certainly doesn't, as the article claims, "outperform Disney."

      --
      systemd is Roko's Basilisk
    • (Score: 2) by Joe Desertrat on Sunday January 28 2018, @12:12AM

      by Joe Desertrat (2454) on Sunday January 28 2018, @12:12AM (#629231)

      However, in the future, this is going to be a big problem. Video evidence is now weak, if there is ever a hint of conspiracy against someone. Politicians will use it. Media desperate for a story may use it. Russia will use it.

      People already believe and repost the most outlandish "evidence" on social media. It won't take much more advancement in video editing to convince them.

  • (Score: 2) by SomeGuy on Saturday January 27 2018, @12:37AM (3 children)

    by SomeGuy (5632) on Saturday January 27 2018, @12:37AM (#628657)

    So can anyone find some pictures of a younger Jabba the Hutt, so we can re-do the added scene in the "updated" A New Hope? :P

    Seriously, my TI-99/4a could draw a more realistic looking Jabba.

  • (Score: 0) by Anonymous Coward on Saturday January 27 2018, @02:57AM (1 child)

    by Anonymous Coward on Saturday January 27 2018, @02:57AM (#628679)

    Huma Mahmood Abedin with Hillary Diane Rodham Clinton

    William Jefferson Clinton with Monica Samille Lewinsky

    Vladimir Vladimirovich Putin with his horse

    Kim Jong-un with a chicken

    • (Score: 1, Funny) by Anonymous Coward on Saturday January 27 2018, @03:27AM

      by Anonymous Coward on Saturday January 27 2018, @03:27AM (#628687)

      RMS with a gnu.

  • (Score: 2) by maxwell demon on Saturday January 27 2018, @07:51AM (2 children)

    by maxwell demon (1608) on Saturday January 27 2018, @07:51AM (#628751) Journal

    Now you don't even need to have nude images of your former partner in order to do revenge porn. Just put his/her face on another nude image.

    Another possible use is to provide fake evidence. Take a CCTV recording, replace the face of some random person appearing on it with the suspect's face, and you've got "proof" that the suspect was at that place.

    --
    The Tao of math: The numbers you can count are not the real numbers.
    • (Score: 0) by Anonymous Coward on Saturday January 27 2018, @04:19PM

      by Anonymous Coward on Saturday January 27 2018, @04:19PM (#628926)

      Cpt. Obvious strikes again. Obviously.

    • (Score: 0) by Anonymous Coward on Sunday January 28 2018, @01:47PM

      by Anonymous Coward on Sunday January 28 2018, @01:47PM (#629435)

      Now you don't even need to have nude images of your former partner in order to do revenge porn.

      And the fake is just as illegal.

  • (Score: 2) by looorg on Saturday January 27 2018, @08:43AM (2 children)

    by looorg (578) on Saturday January 27 2018, @08:43AM (#628769)

    Does it somehow become better when you spank it to fake celeb pictures than to "normal" porn stars? I don't think I really get the celeb fetish. I assume famous people fuck too; the enormous amount of "accidental" nude selfies and videos that get leaked seems to indicate that, but I just don't see them being more spank-worthy than anything else.

    I guess the interesting thing here would be whether the program is now so easy to use, and the results so indistinguishable from reality, that we can't really trust anything anymore in the digital world. Which might be both scary and liberating.

    • (Score: 2) by takyon on Saturday January 27 2018, @09:56AM

      by takyon (881) <reversethis-{gro ... s} {ta} {noykat}> on Saturday January 27 2018, @09:56AM (#628784) Journal

      Some faces just look better than others. There may be a humiliation/revenge aspect too, such as getting back at actresses and actors who have opinions you don't like. Others may not care about the celebs much but want to elicit any kind of reaction from them. Now that this has hit a few news sites, it's only a matter of time before a celeb gives their [amused|outraged|tearful] take on the fake porn they have been edited into.

      Some have an academic interest: "Can it be done? How well can it be done?" Some want to usher in an age where all audio/video evidence is seen as potentially fake. That kind of undermines the explosion of CCTV, dashcams, etc., especially when so much stuff is hackable (plausible to swap a real video in storage for a fake).

      Some want to see if the U.S. or California will try to ban this practice, setting up a 1st Amendment legal battle.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
    • (Score: 2) by takyon on Saturday January 27 2018, @11:30AM

      by takyon (881) <reversethis-{gro ... s} {ta} {noykat}> on Saturday January 27 2018, @11:30AM (#628803) Journal

      Another thing: while this could be used for revenge porn and humiliation against individuals who aren't famous, it is easier to use celebrities, politicians, etc. because they have put themselves out there in the public sphere so much, generating plenty of high-quality images and videos to be used for training data.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 2) by wonkey_monkey on Saturday January 27 2018, @06:01PM (2 children)

    by wonkey_monkey (279) on Saturday January 27 2018, @06:01PM (#629000) Homepage

    One user also "outperformed" the Princess Leia scene at the end of Disney's Rogue One

    Not even close to "outperforming." That's a massive overstatement for the result (which can be seen in glorious 360p video, wow).

    --
    systemd is Roko's Basilisk
    • (Score: 2, Disagree) by takyon on Saturday January 27 2018, @09:37PM (1 child)

      by takyon (881) <reversethis-{gro ... s} {ta} {noykat}> on Saturday January 27 2018, @09:37PM (#629160) Journal

      The video/GIF resolution has nothing to do with it.

      As for your opinion, it can either be treated as expert testimony or tossed out since you have admitted to being a hardcore Star Wars fan editor.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 2) by wonkey_monkey on Sunday January 28 2018, @12:29AM

        by wonkey_monkey (279) on Sunday January 28 2018, @12:29AM (#629236) Homepage

        Resolution has plenty to do with it. We can't see how well the other guy's method really replicated skin texture details and lighting (particularly subtleties like subsurface scattering), because all of those details - which were present in the film - are gone, and were probably never present in his version.

        Here, I rendered my own version which outperforms Disney. You'll need to scale the original film down to 1px x 1px to compare them though:

        https://upload.wikimedia.org/wikipedia/commons/c/ca/1x1.png [wikimedia.org]

        As for your opinion, it can either be treated as expert testimony or tossed out since you have admitted to being a hardcore Star Wars fan editor.

        Why would that be relevant either way?

        Also I'm not sure there are any casual Star Wars fan editors...

        --
        systemd is Roko's Basilisk