
posted by janrinok on Thursday February 08 2018, @02:07AM
from the porn-with-morals dept.

The AI porn purge continues:

Pornhub will be deleting "deepfakes" — AI-generated videos that realistically edit new faces onto pornographic actors — under its rules against nonconsensual porn, following in the footsteps of platforms like Discord and Gfycat. "We do not tolerate any nonconsensual content on the site and we remove all said content as soon as we are made aware of it," the company told Motherboard, which first reported on the deepfakes porn phenomenon last year. Pornhub says that nonconsensual content includes "revenge porn, deepfakes, or anything published without a person's consent or permission."

Update: The infamous subreddit itself, /r/deepfakes, has been banned by Reddit. /r/CelebFakes and /r/CelebrityFakes have also been banned for their non-AI porn fakery (they had existed for over 7 years). Other subreddits like /r/fakeapp (technical support for the software) and /r/SFWdeepfakes remain intact. Reported at Motherboard, The Verge, and TechCrunch.

Motherboard also reported on some users (primarily on a new subreddit, /r/deepfakeservice) offering to accept commissions to create deepfakes porn. This is seen as more likely to result in a lawsuit:

Bringing commercial use into the deepfakes practice opens the creator up to a lawsuit on the basis of right of publicity laws, which describe the right of an individual to control the commercial use of their name, likeness, or any other unequivocal aspect of their identity, legal experts told me.

"The videos are probably wrongful under the law whether or not money is exchanged," Charles Duan, associate director of tech and innovation policy at the advocacy group R Street Institute think tank, told me. "But what's important is that the commercial exchange creates a focal point for tracing and hopefully stopping this activity. It might be easy to be anonymous on the internet, but it's a lot harder when you want to be paid."

[...] David Greene, Civil Liberties Director at the Electronic Frontier Foundation, told me on the phone that buying and selling, like everything with deepfakes, may be clearly unsavory behavior, but not necessarily illegal. "I want to separate something that's probably a dumb legal idea from something that's just a socially bad thing to do," Greene said. "If you're doing it to harass somebody, it's certainly a bad idea legally and socially."

Update: However, /r/deepfakeservice has also been hit with the banhammer. Looks like "deepfakes" will soon become "darkwebfakes".

Previously: AI-Generated Fake Celebrity Porn Craze "Blowing Up" on Reddit
Discord Takes Down "Deepfakes" Channel, Citing Policy Against "Revenge Porn"

Related: Linux Use on Pornhub Surged 14% in 2016
Pornhub's Newest Videos Can Reach Out and Touch You
Pornhub Adopts Machine Learning to Tag Videos as Malvertising Looms
Pornhub's First Store has a Livestreaming Bed Camera, of Course


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 4, Insightful) by takyon on Thursday February 08 2018, @03:36AM (10 children)

    by takyon (881) <takyonNO@SPAMsoylentnews.org> on Thursday February 08 2018, @03:36AM (#634660) Journal

    The centralized web giants have gone from censoring illegal content to censoring content that makes people feel bad. The implications of widespread fakery don't matter to them.

    "Involuntary pornography" is voluntary. One party volunteered to get naked on camera, and another party volunteered to make their face known by publishing photos or videos of it, or walking around in public (a bigger concern with celebrities, but everyone is on camera now). Finally, some random person on the web volunteered to mash the two together.

    A lot of the appeal here seems to be the desire for celebrities who are "hard to get" (don't appear nude). But machine learning will also lead to the creation of "virtual actors" who mix and match traits from so many people that they can't be tied to any specific one. Or they can just move the faux celeb nudes off of Giphy/Reddit/Cornhub/etc. and onto decentralized platforms or overseas ones that don't care.

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 1, Insightful) by Anonymous Coward on Thursday February 08 2018, @05:00AM (5 children)

    by Anonymous Coward on Thursday February 08 2018, @05:00AM (#634681)

    Sigh, really. No, it's not voluntary: somebody's face is now pasted onto somebody else's body, making it appear that they were engaged in acts that they weren't. For many celebrities, their ability to earn money is based in part on the image that they've cultivated.

    For people who aren't celebrities, this is a creepy tool that can be used to create porn of somebody who didn't actually consent to be involved in porn. This is sort of like when a director uses a body double in a movie to make it appear as though the actor was naked. Common decency alone ought to be sufficient to tell people that it's not OK to create porn that appears to depict somebody who didn't consent to being in porn.

    • (Score: 0) by Anonymous Coward on Thursday February 08 2018, @06:17AM

      by Anonymous Coward on Thursday February 08 2018, @06:17AM (#634723)

      Another critically threatened faction is the porn actor. With this evolving technology, a virtually perfect (for each individual's taste, no less!) porn actor will perform any stunt, even stunts not physically possible given human anatomy.

      A real "wonder woman", actually bending large pipes, using humongous pectoralis major muscles, which grow and shrink according to the state of her energy levels... biceps the shape and size of baseballs erupt on tension, calf muscles tightly bunching up pulling their achilles tendon tight as tow ropes. Yet all melting back to voluptious womanhood when relaxed...

      No real human is capable of such illusions, short of Hollywood magic... but apparently anything anyone can even imagine, every erotic fantasy, will be imaged. Morphs to bestiality... child porn of completely synthesized features... anything anyone can dream up!

      Would any existing human mate physically measure up?

      It was bad enough having advertising agencies set the bar for what people should look like. Even today, the instant I turn on the TV, there is some advertiser selling the treatment for something that he says others think is ugly. I stink. I have stubble. Bags under my eyes! A roll of fat here or there. Too fat. Too thin. Not in fashion. Not seen in the right place. No one's gonna want me, so I better buy now while their stuff is still available. We'll double the offer, just pay a separate fee.

      Just wait for the "joystick" and "joystick receptacle" and other lifelike sex toys/dolls to make the scene... bluetooth enabled for sync to the VR headset.

      I guess the government will have to start paying people to procreate the way it's been done since life came into existence on this planet, as the "real" thing won't be nearly as much fun as playing around with the "imaginary" thing... and not only that, you will end up with kids to raise!

      Yes, I know this whole thing looks way too much like those spam posts that have been showing up here.

    • (Score: 3, Insightful) by takyon on Thursday February 08 2018, @06:45AM (2 children)

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Thursday February 08 2018, @06:45AM (#634742) Journal

      You talk about common decency, but how about legality? Personality rights [wikipedia.org] seem to apply only to the use of someone's image for commercial purposes. That might be applicable to faceswap porn if there is ad revenue or a commission involved. Absent that, if it's just a labor of "love", then it doesn't seem to violate any laws or infringe on "personality rights" (which could be wiped out by the Supreme Court).

      Common decency alone is not sufficient to suppress activities that are legal. The person running their GTX 1080 GPU hot for 12 hours in order to swap a celeb's face onto a porn star's body has transcended the shackles of common decency. The best you can get them for is probably a copyright violation... of the porn producer's copyright, which could lead to a DMCA cat-and-mouse game. But it would likely go unnoticed unless the porn company actively looks for it, especially since the new work could be relabeled without the original title.

      If it isn't illegal to use someone's likeness, then the person contributes that likeness simply by venturing out in public and being a target of photography.

      Could it be illegal to train the AI using frames of someone's face ripped from a Hollywood movie? It could be very hard to reverse the process and prove where the imagery came from.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 2) by DeathMonkey on Thursday February 08 2018, @06:02PM (1 child)

        by DeathMonkey (1380) on Thursday February 08 2018, @06:02PM (#635070) Journal

        This is from your own link:

        In the United States, the right of publicity is a state law-based right, as opposed to federal, and recognition of the right can vary from state to state.[30] The rationale underlying the right of publicity in the United States is rooted in both privacy and economic exploitation.[31] The rights are based in tort law, and the four causes of action are: 1) Intrusion upon physical solitude; 2) public disclosure of private facts; 3) depiction in a false light; and 4) appropriation of name and likeness.

        3 and 4 would both seem to apply.

        • (Score: 2) by takyon on Thursday February 08 2018, @06:24PM

          by takyon (881) <takyonNO@SPAMsoylentnews.org> on Thursday February 08 2018, @06:24PM (#635089) Journal

          From the source [rightofpublicity.com] at the end of that paragraph:

          The Right of Publicity is likely to experience continued evolution in 2018, primarily from Right of Publicity legislation and case law (often involving fascinating fact patterns). One could also observe that the U.S. now has a President who entered office with more Right of Publicity licensing than any prior elected official. In short, it is evident that the Right of Publicity will continue generating vigorous debate. The Right of Publicity is often misunderstood, in part because recognition of the Right of Publicity varies significantly from state to state and country to country.

          The laws vary by state. What if someone who doesn't live in California or Indiana creates the porn? The laws also describe civil matters, not criminal. The federal Lanham Act deals with commercial appropriation of someone's likeness. The laws could be made obsolete by the Supreme Court in the coming years.

          I don't see a lot of recourse for the "victims". Suing users successfully will be difficult, and they may trigger a Streisand effect by trying to get the faked porn taken down.

          People Can Put Your Face on Porn—and the Law Can't Help You [wired.com]

          Franks helped write much of the US’s existing legislation that criminalizes nonconsensual porn—and it's not going to help. It’s not that Franks and lawmakers weren’t thinking about the implications of manipulated images. It’s that the premise of any current legislation is that nonconsensual porn is a privacy violation. Face-swap porn may be deeply, personally humiliating for the people whose likeness is used, but it's technically not a privacy issue. That's because, unlike a nude photo filched from the cloud, this kind of material is bogus. You can’t sue someone for exposing the intimate details of your life when it’s not your life they’re exposing.

          And it's the very artifice involved in these videos that provides enormous legal cover for their creators. “It falls through the cracks because it’s all very betwixt and between,” says Danielle Citron, a law professor at the University of Maryland and the author of Hate Crimes in Cyberspace. “There are all sorts of First Amendment problems because it’s not their real body.” Since US privacy laws don’t apply, taking these videos down could be considered censorship—after all, this is “art” that redditors have crafted, even if it’s unseemly.

          [...] Does that mean that victims have zero hope of legal recourse? Not exactly. Celebrities will be able to sue for the misappropriation of their images. But that usually applies to commercial contexts—like, say, if someone took a social media photo of Gal Gadot’s and then used it to promote a strip club without her consent—and commercial speech doesn’t have nearly the protection individual citizens’ does.

          For the average citizen, your best hope is anti-defamation law. When Franks realized that revenge porn law wouldn't include language about false images, she recommended that lawmakers update their anti-defamation statutes to handle it—but in many cases, that hasn’t happened yet. And Franks thinks claimants will have difficulty proving that the creators intended to cause them emotional distress. So far, these videos do seem to have been created for the pleasure of the creator rather than the humiliation of the object of their desire. “Inevitably, someone will point out how many young men had posters of Princess Leia in their bedrooms as a masturbation fantasy,” Franks says. “Is the harm just that you found out about it? Legally, we need to be able to articulate what is the harm, not just that it makes us feel icky.” And in such a fringe case as AI-enabled porn, that hasn’t happened yet.

          --
          [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
    • (Score: 0) by Anonymous Coward on Thursday February 08 2018, @07:08PM

      by Anonymous Coward on Thursday February 08 2018, @07:08PM (#635128)

      Lmao! Well, stealing people's faces is at least assault with a deadly weapon, so you know...

  • (Score: 0) by Anonymous Coward on Thursday February 08 2018, @09:32AM (3 children)

    by Anonymous Coward on Thursday February 08 2018, @09:32AM (#634794)

    or walking around in public (a bigger concern with celebrities, but everyone is on camera now).

    Yeah, and all they have to do to avoid that is to become a shut-in or live as a hermit away from society. Three cheers for our mass surveillance society! This is such a good thing.

    • (Score: 2) by takyon on Thursday February 08 2018, @01:48PM (2 children)

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Thursday February 08 2018, @01:48PM (#634911) Journal

      That's about where we're at.

      Another option is to quietly accept it. Many Americans have.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 0) by Anonymous Coward on Thursday February 08 2018, @06:31PM (1 child)

        by Anonymous Coward on Thursday February 08 2018, @06:31PM (#635098)

        There is no choice but to accept it. However, the options you presented are only the means by which one deals with it.

        Shutting yourself in as a hermit is clearly an indicator, if not a confirmation, of not wishing to be surveilled. It says nothing about any action to stop the surveillance.

        • (Score: 2) by takyon on Thursday February 08 2018, @06:46PM

          by takyon (881) <takyonNO@SPAMsoylentnews.org> on Thursday February 08 2018, @06:46PM (#635111) Journal

          To be more specific, I meant that another option is to go about your business as usual with the surveillance state intact, rather than becoming a hermit or shut-in, i.e., do nothing to try to avoid it.

          --
          [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]