posted by janrinok on Thursday February 08 2018, @02:07AM
from the porn-with-morals dept.

The AI porn purge continues:

Pornhub will be deleting "deepfakes" — AI-generated videos that realistically edit new faces onto pornographic actors — under its rules against nonconsensual porn, following in the footsteps of platforms like Discord and Gfycat. "We do not tolerate any nonconsensual content on the site and we remove all said content as soon as we are made aware of it," the company told Motherboard, which first reported on the deepfakes porn phenomenon last year. Pornhub says that nonconsensual content includes "revenge porn, deepfakes, or anything published without a person's consent or permission."

Update: The infamous subreddit itself, /r/deepfakes, has been banned by Reddit. /r/CelebFakes and /r/CelebrityFakes have also been banned for their non-AI porn fakery (they had existed for over 7 years). Other subreddits like /r/fakeapp (technical support for the software) and /r/SFWdeepfakes remain intact. Reported at Motherboard, The Verge, and TechCrunch.
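
For background on the software itself: FakeApp-style face swapping is commonly described as a shared-encoder, dual-decoder autoencoder. One encoder learns features common to both faces, each identity gets its own decoder, and the swap consists of decoding one person's frames with the other person's decoder. Below is a minimal, hypothetical sketch of that idea in PyTorch; the layer sizes, learning rate, and function names are illustrative assumptions, not FakeApp's actual code.

    import torch
    import torch.nn as nn

    LATENT = 256  # size of the shared latent face code (illustrative choice)

    class Encoder(nn.Module):
        """Maps an aligned 3x64x64 face crop to a shared latent code."""
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.ReLU(),    # 64x64 -> 32x32
                nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 32x32 -> 16x16
                nn.Flatten(),
                nn.Linear(128 * 16 * 16, LATENT),
            )
        def forward(self, x):
            return self.net(x)

    class Decoder(nn.Module):
        """Reconstructs a face for ONE identity from the shared code."""
        def __init__(self):
            super().__init__()
            self.fc = nn.Linear(LATENT, 128 * 16 * 16)
            self.net = nn.Sequential(
                nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 16x16 -> 32x32
                nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32x32 -> 64x64
            )
        def forward(self, z):
            return self.net(self.fc(z).view(-1, 128, 16, 16))

    encoder = Encoder()                  # one encoder shared by both identities
    dec_a, dec_b = Decoder(), Decoder()  # one decoder per identity
    params = (list(encoder.parameters()) + list(dec_a.parameters())
              + list(dec_b.parameters()))
    opt = torch.optim.Adam(params, lr=5e-5)
    loss_fn = nn.MSELoss()

    def train_step(faces_a, faces_b):
        # Each identity learns to reconstruct itself through its own decoder,
        # forcing the shared encoder to learn identity-agnostic face features.
        opt.zero_grad()
        loss = (loss_fn(dec_a(encoder(faces_a)), faces_a)
                + loss_fn(dec_b(encoder(faces_b)), faces_b))
        loss.backward()
        opt.step()
        return loss.item()

    def swap_a_to_b(faces_a):
        # The "deepfake" step: encode person A, decode with B's decoder.
        with torch.no_grad():
            return dec_b(encoder(faces_a))

A real pipeline also has to detect, align, and blend faces frame by frame before and after this step; the autoencoder above is only the core of the swap.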

Motherboard also reported on some users (primarily on a new subreddit, /r/deepfakeservice) offering to accept commissions to create deepfakes porn. This is seen as more likely to result in a lawsuit:

Bringing commercial use into the deepfakes practice opens the creator up to a lawsuit on the basis of right of publicity laws, which describe the right of an individual to control the commercial use of their name, likeness, or any other unequivocal aspect of their identity, legal experts told me.

"The videos are probably wrongful under the law whether or not money is exchanged," Charles Duan, associate director of tech and innovation policy at the advocacy group R Street Institute think tank, told me. "But what's important is that the commercial exchange creates a focal point for tracing and hopefully stopping this activity. It might be easy to be anonymous on the internet, but it's a lot harder when you want to be paid."

[...] David Greene, Civil Liberties Director at the Electronic Frontier Foundation, told me on the phone that buying and selling deepfakes, like everything else about them, may be clearly unsavory behavior, but not necessarily illegal. "I want to separate something that's probably a dumb legal idea from something that's just a socially bad thing to do," Greene said. "If you're doing it to harass somebody, it's certainly a bad idea legally and socially."

Update: However, /r/deepfakeservice has also been hit with the banhammer. Looks like "deepfakes" will soon become "darkwebfakes".

Previously: AI-Generated Fake Celebrity Porn Craze "Blowing Up" on Reddit
Discord Takes Down "Deepfakes" Channel, Citing Policy Against "Revenge Porn"

Related: Linux Use on Pornhub Surged 14% in 2016
Pornhub's Newest Videos Can Reach Out and Touch You
Pornhub Adopts Machine Learning to Tag Videos as Malvertising Looms
Pornhub's First Store has a Livestreaming Bed Camera, of Course


Original Submission

 
  • (Score: 2) by takyon (881) <takyonNO@SPAMsoylentnews.org> on Thursday February 08 2018, @06:24PM (#635089) Journal

    From the source [rightofpublicity.com] at the end of that paragraph:

    The Right of Publicity is likely to experience continued evolution in 2018, primarily from Right of Publicity legislation and case law (often involving fascinating fact patterns). One could also observe that the U.S. now has a President who entered office with more Right of Publicity licensing than any prior elected official. In short, it is evident that the Right of Publicity will continue generating vigorous debate. The Right of Publicity is often misunderstood, in part because recognition of the Right of Publicity varies significantly from state to state and country to country.

    Right of publicity laws vary by state; what happens if the person creating the porn doesn't live in a state with strong protections, like California or Indiana? These laws also create civil liability, not criminal. The federal Lanham Act covers commercial appropriation of someone's likeness, but that too is a civil claim. And the Supreme Court could narrow or invalidate such laws in the coming years.

    I don't see a lot of recourse for the "victims". Suing individual users successfully will be difficult, and trying to get the faked porn taken down may only trigger a Streisand effect.

    People Can Put Your Face on Porn—and the Law Can't Help You [wired.com]

    Franks helped write much of the US’s existing legislation that criminalizes nonconsensual porn—and it's not going to help. It’s not that Franks and lawmakers weren’t thinking about the implications of manipulated images. It’s that the premise of any current legislation is that nonconsensual porn is a privacy violation. Face-swap porn may be deeply, personally humiliating for the people whose likeness is used, but it's technically not a privacy issue. That's because, unlike a nude photo filched from the cloud, this kind of material is bogus. You can’t sue someone for exposing the intimate details of your life when it’s not your life they’re exposing.

    And it's the very artifice involved in these videos that provides enormous legal cover for their creators. “It falls through the cracks because it’s all very betwixt and between,” says Danielle Citron, a law professor at the University of Maryland and the author of Hate Crimes in Cyberspace. “There are all sorts of First Amendment problems because it’s not their real body.” Since US privacy laws don’t apply, taking these videos down could be considered censorship—after all, this is “art” that redditors have crafted, even if it’s unseemly.

    [...] Does that mean that victims have zero hope of legal recourse? Not exactly. Celebrities will be able to sue for the misappropriation of their images. But that usually applies to commercial contexts—like, say, if someone took a social media photo of Gal Gadot's and then used it to promote a strip club without her consent—and commercial speech doesn't have nearly the protection individual citizens' speech does.

    For the average citizen, your best hope is anti-defamation law. When Franks realized that revenge porn law wouldn't include language about false images, she recommended that lawmakers update their anti-defamation statutes to handle it—but in many cases, that hasn't happened yet. And Franks thinks claimants will have difficulty proving that the creators intended to cause them emotional distress. So far, these videos do seem to have been created for the pleasure of the creator rather than the humiliation of the object of their desire. "Inevitably, someone will point out how many young men had posters of Princess Leia in their bedrooms as a masturbation fantasy," Franks says. "Is the harm just that you found out about it? Legally, we need to be able to articulate what is the harm, not just that it makes us feel icky." And in such a fringe case as AI-enabled porn, that hasn't happened yet.
