posted by janrinok on Thursday February 08 2018, @02:07AM
from the porn-with-morals dept.

The AI porn purge continues:

Pornhub will be deleting "deepfakes" — AI-generated videos that realistically edit new faces onto pornographic actors — under its rules against nonconsensual porn, following in the footsteps of platforms like Discord and Gfycat. "We do not tolerate any nonconsensual content on the site and we remove all said content as soon as we are made aware of it," the company told Motherboard, which first reported on the deepfakes porn phenomenon last year. Pornhub says that nonconsensual content includes "revenge porn, deepfakes, or anything published without a person's consent or permission."
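
For those wondering how the face swap is actually done: the tools reportedly train an autoencoder with a single shared encoder and one decoder per identity on aligned face crops, then cross-wire encoder and decoder to map one person's pose and expression onto the other's face. Below is a minimal, illustrative PyTorch sketch of that idea only - it is not the FakeApp code, and the layer sizes, the 64x64 crop size, and the random tensors standing in for real face crops are assumptions for illustration.

    # Illustrative sketch of the shared-encoder / per-identity-decoder autoencoder
    # idea reportedly behind "deepfakes" face swapping. Layer sizes, crop size and
    # the random tensors standing in for aligned face crops are assumptions.
    import torch
    import torch.nn as nn

    class Encoder(nn.Module):
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64x64 -> 32x32
                nn.ReLU(),
                nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32x32 -> 16x16
                nn.ReLU(),
                nn.Flatten(),
                nn.Linear(64 * 16 * 16, 256),               # shared latent code
            )

        def forward(self, x):
            return self.net(x)

    class Decoder(nn.Module):
        def __init__(self):
            super().__init__()
            self.fc = nn.Linear(256, 64 * 16 * 16)
            self.net = nn.Sequential(
                nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 16x16 -> 32x32
                nn.ReLU(),
                nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 32x32 -> 64x64
                nn.Sigmoid(),
            )

        def forward(self, z):
            return self.net(self.fc(z).view(-1, 64, 16, 16))

    encoder = Encoder()
    decoder_a = Decoder()  # reconstructs faces of identity A (the face being inserted)
    decoder_b = Decoder()  # reconstructs faces of identity B (the person in the video)

    params = list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters())
    opt = torch.optim.Adam(params, lr=1e-4)
    loss_fn = nn.L1Loss()

    # Stand-ins for batches of aligned 64x64 face crops of each identity.
    faces_a = torch.rand(8, 3, 64, 64)
    faces_b = torch.rand(8, 3, 64, 64)

    for step in range(100):  # real tools train for many thousands of steps
        opt.zero_grad()
        # Both identities share the encoder but keep their own decoder.
        loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) + \
               loss_fn(decoder_b(encoder(faces_b)), faces_b)
        loss.backward()
        opt.step()

    # The "swap": encode a face of B but decode it with A's decoder, producing an
    # image with B's pose and expression but A's face.
    with torch.no_grad():
        swapped = decoder_a(encoder(faces_b))

The shared encoder is pushed to learn a pose-and-expression representation common to both identities, while each decoder learns to render one specific face, which is why the cross-wiring at the end produces a swap at all.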

Update: The infamous subreddit itself, /r/deepfakes, has been banned by Reddit. /r/CelebFakes and /r/CelebrityFakes have also been banned for their non-AI porn fakery (they had existed for over 7 years). Other subreddits like /r/fakeapp (technical support for the software) and /r/SFWdeepfakes remain intact. Reported at Motherboard, The Verge, and TechCrunch.

Motherboard also reported on some users (primarily on a new subreddit, /r/deepfakeservice) offering to accept commissions to create deepfakes porn. This is seen as more likely to result in a lawsuit:

Bringing commercial use into the deepfakes practice opens the creator up to a lawsuit on the basis of right of publicity laws, which describe the right of an individual to control the commercial use of their name, likeness, or any other unequivocal aspect of their identity, legal experts told me.

"The videos are probably wrongful under the law whether or not money is exchanged," Charles Duan, associate director of tech and innovation policy at the advocacy group R Street Institute think tank, told me. "But what's important is that the commercial exchange creates a focal point for tracing and hopefully stopping this activity. It might be easy to be anonymous on the internet, but it's a lot harder when you want to be paid."

[...] David Greene, Civil Liberties Director at the Electronic Frontier Foundation, told me on the phone that buying and selling, like everything with deepfakes, may be clearly unsavory behavior, but not necessarily illegal. "I want to separate something that's probably a dumb legal idea from something that's just a socially bad thing to do," Greene said. "If you're doing it to harass somebody, it's certainly a bad idea legally and socially."

Update: However, /r/deepfakeservice has also been hit with the banhammer. Looks like "deepfakes" will soon become "darkwebfakes".

Previously: AI-Generated Fake Celebrity Porn Craze "Blowing Up" on Reddit
Discord Takes Down "Deepfakes" Channel, Citing Policy Against "Revenge Porn"

Related: Linux Use on Pornhub Surged 14% in 2016
Pornhub's Newest Videos Can Reach Out and Touch You
Pornhub Adopts Machine Learning to Tag Videos as Malvertising Looms
Pornhub's First Store has a Livestreaming Bed Camera, of Course


Original Submission

 
  • (Score: 4, Interesting) by Anonymous Coward on Thursday February 08 2018, @03:17AM (9 children)

    I'd be interested in hearing why supporters of this move feel one twin couldn't exercise the same right they presumably claim the celebrity subjects have, in order to prevent the other twin from voluntarily sending porn of themselves to their SO.

    Either people can censor works which appear similar to their face, or they can't. The existence of others with the same face can't impact that right, or it was never a right.

  • (Score: 0) by Anonymous Coward on Thursday February 08 2018, @04:57AM (6 children)

    This is a rather poor argument to make. The two situations are completely different. With twins, you'd have to have an actual identical twin for this to work, and most celebrities don't have a twin, identical or not.

    Secondly, this violates a person's right to control their own image. With twins, the one has to actually look like the other for it to possibly work; with the AI stuff, you just need sufficient images of the person to create a replacement face for somebody else with a similar body type. That's nowhere near as rare.

    • (Score: 5, Insightful) by janrinok on Thursday February 08 2018, @07:03AM (4 children)

      a person's right to control their own image

      I must have missed this one somewhere. Where is this 'right' enshrined?

      So Weird Al's take-off of Michael Jackson is illegal because of a 'right' that is applicable worldwide? I've checked the UN site - the most globally applicable source even if most people do not accept it as such - and I can find nothing. Or is the claimed right only being invoked because it involves 'porn'? In which case, I could make a deepfake of someone committing murder but that would be OK, is that it?

      If such a right exists, how come no famous person has taken legal action against the makers of deepfakes yet? Surely, this is a money-making opportunity in the US that the rich and famous cannot afford to miss. Or perhaps they have and I, living outside the US, haven't heard about it because the 'right' that you are claiming doesn't actually exist elsewhere in the world?

      I'm not saying that I think deepfakes of celebrities in acts of pornography are a great idea, but I am pointing out that many people seem to believe they have rights that don't actually exist. There are legal measures already in existence for people to address defamation of character, slander or libel, but suggesting that there is an all-encompassing right to one's personal appearance enshrined in law somewhere is perhaps a bit misleading. And defamation of character might be a tricky one to prove if it transpires that the star in question did perform sexual acts in her earlier days in a bid to help her achieve the fame she sought.

      • (Score: 1, Informative) by Anonymous Coward on Thursday February 08 2018, @08:39AM (1 child)

        While I don't disagree with the substance of your post, it is well known that Weird Al always gets permission from the subjects of his parodies.

        • (Score: 2) by Grishnakh on Thursday February 08 2018, @04:26PM

          Not only that, but doesn't (didn't) Weird Al also sometimes get the original band members to play for him?

      • (Score: 3, Insightful) by Wootery on Thursday February 08 2018, @11:14AM (1 child)

        Where is this 'right' enshrined?

        In the US? Varies by state, apparently: https://en.wikipedia.org/w/index.php?title=Personality_rights&oldid=823760789#United_States [wikipedia.org]

        • (Score: 2) by janrinok on Thursday February 08 2018, @03:52PM

          This raised a couple of questions:

          If the right is not recognised world-wide but is, in fact, only applicable in the US (and even then the extent of applicability varies from state to state), would it apply to, say, Emma Watson? She is a UK citizen. While she is in the US she has to comply with US law, but anyone else in the world could use her image because no offence under such laws would be committed. Nor could she claim protection under US law while she is outside of the US, for she is not American. US law is not applicable to non-US citizens outside of the US, a fact that seems to be often overlooked in discussions here. And, increasingly, non-US governments are baulking at US requests to have people extradited for actions that are not offences in the country in which the alleged act took place, or would prefer to have the punishment meted out in their own courts rather than face the plea-bargaining and extreme measures seen within the US.

          Following on, if a deepfake image of a US personality is made by someone outside of the US then, again, perhaps no law has been contravened. The claimed 'right' to control one's image only exists in the US under US law and is not applicable to anyone else. While this will prevent US websites and US citizens from being able to legally produce deepfake images, it might have no effect on their production elsewhere. Any US citizen who has been the subject of a deepfake image must first ascertain who committed the perceived offence and then see whether any laws have been broken in the country in which the image was made.

          Again, I stress that I do not support the creation of such images, but I don't think that the current moves on the part of Pornhub et al are going to have much of a dampening effect on their production.

    • (Score: 0) by Anonymous Coward on Thursday February 08 2018, @04:57PM

      Why do you feel considering non-extant situations cannot give insight? Physics students sure spend a lot of time on hypothetical situations; I'm sure they'd be glad to realize they don't need to spend all that time considering frictionless surfaces, since those can never occur and so nothing useful can be gleaned from their consideration.

      Why are the situations different? You fail to argue for that point.

      this violates a person's right to control their own image

      My whole point is that such a right cannot sensibly exist. You're begging the question.

      You seem to think that a situation being rare exempts it from consideration; see the first point.

  • (Score: 2) by DeathMonkey on Thursday February 08 2018, @06:23PM (1 child)

    I'd be interested in hearing why the supporters of this move feel one twin couldn't exercise the same right...

    Is there anyone suggesting a twin shouldn't be entitled to his/her own personality rights? Sounds like a strawman to me....

    • (Score: 0) by Anonymous Coward on Friday February 09 2018, @03:46PM

      I'm suggesting that the implications of enshrining in law a right to control media which appears to depict oneself are worse than those of not doing so, and implying that the supporters haven't given due consideration to the outcome of their proposed solution.

      "If you treat this as a right, then you're forced into accepting this distasteful situation." (which I imply is worse than the problem you're trying to solve by granting that right)