
posted by janrinok on Thursday February 08 2018, @02:07AM   Printer-friendly
from the porn-with-morals dept.

The AI porn purge continues:

Pornhub will be deleting "deepfakes" — AI-generated videos that realistically edit new faces onto pornographic actors — under its rules against nonconsensual porn, following in the footsteps of platforms like Discord and Gfycat. "We do not tolerate any nonconsensual content on the site and we remove all said content as soon as we are made aware of it," the company told Motherboard, which first reported on the deepfakes porn phenomenon last year. Pornhub says that nonconsensual content includes "revenge porn, deepfakes, or anything published without a person's consent or permission."

Update: The infamous subreddit itself, /r/deepfakes, has been banned by Reddit. /r/CelebFakes and /r/CelebrityFakes have also been banned for their non-AI porn fakery (they had existed for over 7 years). Other subreddits like /r/fakeapp (technical support for the software) and /r/SFWdeepfakes remain intact. Reported at Motherboard, The Verge, and TechCrunch.

Motherboard also reported on some users (primarily on a new subreddit, /r/deepfakeservice) offering to accept commissions to create deepfakes porn. This is seen as more likely to result in a lawsuit:

Bringing commercial use into the deepfakes practice opens the creator up to a lawsuit on the basis of right of publicity laws, which describe the right of an individual to control the commercial use of their name, likeness, or any other unequivocal aspect of their identity, legal experts told me.

"The videos are probably wrongful under the law whether or not money is exchanged," Charles Duan, associate director of tech and innovation policy at the advocacy group R Street Institute think tank, told me. "But what's important is that the commercial exchange creates a focal point for tracing and hopefully stopping this activity. It might be easy to be anonymous on the internet, but it's a lot harder when you want to be paid."

[...] David Greene, Civil Liberties Director at the Electronic Frontier Foundation, told me on the phone that buying and selling, like everything with deepfakes, may be clearly unsavory behavior, but not necessarily illegal. "I want to separate something that's probably a dumb legal idea from something that's just a socially bad thing to do," Greene said. "If you're doing it to harass somebody, it's certainly a bad idea legally and socially."

Update: However, /r/deepfakeservice has also been hit with the banhammer. Looks like "deepfakes" will soon become "darkwebfakes".

Previously: AI-Generated Fake Celebrity Porn Craze "Blowing Up" on Reddit
Discord Takes Down "Deepfakes" Channel, Citing Policy Against "Revenge Porn"

Related: Linux Use on Pornhub Surged 14% in 2016
Pornhub's Newest Videos Can Reach Out and Touch You
Pornhub Adopts Machine Learning to Tag Videos as Malvertising Looms
Pornhub's First Store has a Livestreaming Bed Camera, of Course


Original Submission

Related Stories

Linux Use on Pornhub Surged 14% in 2016 25 comments

Submitted via IRC for TheMightyBuzzard

Pornhub is one of the pre-eminent porn sites on the web. Each year Pornhub releases a year-in-review post with anonymous details about the site's users. More and more Linux users are visiting Pornhub: Linux saw an impressive 14% increase in traffic share in 2016.

[...] While Windows continues to dominate when it comes to which operating system users count on to watch Pornhub (about 80% of desktop users), Mac OS and Linux are on the rise, with Mac OS up 8% in traffic share and Linux up an impressive 14%.

Moving on to mobile: the playing field is pretty even here, with Android and Apple iOS almost on par with one another. Android leads with 3% more Pornhub users than Apple iOS (47% of Pornhub's mobile users), and Android's mobile market share has increased by 5% over the last year.

Look, it wasn't all me. I swear.

Source: http://www.infoworld.com/article/3158159/linux/linux-use-on-pornhub-surged-14-in-2016.html


Original Submission

Pornhub's Newest Videos Can Reach Out and Touch You 21 comments

The future of sex could be pretty interactive, but it's starting with men.

Top porn streaming company Pornhub announced a new channel of interactive videos that will work with the latest generation of connected male sex toys. The videos, according to Pornhub, will work with the Fleshlight Launch and Kiiroo Onyx, featuring "an eclectic mix of content, offering an assortment of themes and appealing to various target audiences."

Sex toys and content geared to women are arriving later on, working with OhMiBod, We-Vibe, Lovense Lush and Kiiroo Pearl, but Pornhub didn't confirm when.

The new interactive channel will also work with VR eventually, combining the synced content with immersive video.

Thank goodness. I'm fresh out of latinum for Quark's holosuites.


Original Submission

Pornhub Adopts Machine Learning to Tag Videos as Malvertising Looms 17 comments

Pornhub has begun to use machine learning to automatically tag videos:

Artificial intelligence has proven to be a dab hand at recognizing what's going on in photos and videos, but the datasets it's usually trained on are pretty genteel. Not so for Pornhub, which announced today that it's using machine learning to automatically catalog its videos.

The site is starting small, deploying facial recognition software that will detect 10,000 individual porn stars and tag them in footage. (Usually this information is provided by uploaders and viewers, who will still play a part by verifying the software's choices.) It plans to scan all 5 million of its videos "within the next year," and then move on to more complicated territory: using the software to identify the specific categories videos belong to, like "public" and "blonde."

In a press statement, Pornhub VP Corey Price said the company was joining the trend of firms using AI to "expedite antiquated processes." However, the speed at which Pornhub's AI processes the data doesn't seem like it would be an improvement on its current crowdsourced system. While in beta, the machine learning software apparently scanned some 50,000 videos in a month. At that rate it would take nearly a decade to scan the entire site, but presumably improvements are being made.
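
In outline, such tagging systems embed each detected face as a numeric vector and match it against a gallery of known performers. Here is a minimal Python sketch of that general technique; Pornhub has not published its pipeline, so embed_face() below is a hypothetical stand-in for any pretrained face-embedding model, and the matching threshold is an illustrative assumption.

    import numpy as np

    def embed_face(face_crop: np.ndarray) -> np.ndarray:
        """Hypothetical stand-in for a pretrained face-embedding model.
        This deterministic random projection is meaningless but keeps the
        sketch executable end to end."""
        rng = np.random.default_rng(0)
        projection = rng.standard_normal((face_crop.size, 128))
        v = face_crop.reshape(-1).astype(float) @ projection
        return v / np.linalg.norm(v)

    def build_gallery(known_faces):
        """Embed one reference crop per known performer name."""
        names = list(known_faces)
        vectors = np.stack([embed_face(img) for img in known_faces.values()])
        return names, vectors

    def tag_face(face_crop, names, vectors, threshold=0.6):
        """Nearest-neighbour match by cosine similarity; below threshold = unknown."""
        v = embed_face(face_crop)
        sims = vectors @ v           # unit vectors, so dot product = cosine
        best = int(np.argmax(sims))
        return names[best] if sims[best] >= threshold else None

    # Toy usage with random "face crops":
    gallery = {name: np.random.rand(64, 64, 3) for name in ("alice", "bob")}
    names, vectors = build_gallery(gallery)
    print(tag_face(gallery["alice"], names, vectors))  # -> "alice"

The uploader/viewer verification mentioned in the excerpt then slots in naturally: the system proposes its best match and humans confirm or reject it.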

Meanwhile, a security firm has warned that millions of Pornhub users were targeted by "malvertising" for more than a year:

Millions of Pornhub users were targeted with a malvertising attack that sought to trick them into installing malware on their PCs, according to infosec firm Proofpoint.

By the time the attack was uncovered, it had been active "for more than a year", Proofpoint said, having already "exposed millions of potential victims in the US, Canada, the UK, and Australia" to malware by pretending to be software updates to popular browsers.

Although Pornhub, the world's largest pornography site with 26bn yearly visits according to data from ranking firm Alexa, and its advertising network have shut down the infection pathway, the attack is still ongoing on other sites.

Also at TechCrunch, Engadget, and The Sacramento Bee.

Related: BugReplay - Finding How Ads Get Past the Blockers
Linux Use on Pornhub Surged 14% in 2016
Malvertising Campaign Finds a Way Around Ad Blockers
Pornhub's Newest Videos Can Reach Out and Touch You


Original Submission

Pornhub's First Store has a Livestreaming Bed Camera, of Course 19 comments

Pop-up stores are all the rage, but Pornhub's shop in New York City is offering something... unique. If you visit its just-opened location on 70 Wooster Street, you'll see a bed with a camera that livestreams directly to the porn giant's website. No, you can't get away with what normally happens on a bed at Pornhub, but you are encouraged to "interact" with the camera. And let's be honest: this is probably your best shot at appearing live on a porn site without having to explain a surprise career move.

The store itself (which, unsurprisingly, is adults-only) is also notable as Pornhub's retail debut. And it's mostly about fashion. You'll see some sex toys and "aphrodisiac herbs," but most of the wares are either self-branded clothing or apparel from Pornhub's partners. The company knowingly set up shop next to high fashion brands, in fact. While no one would confuse the porn purveyor with its haute couture neighbors, the company clearly wants to be taken seriously.

If you're curious enough to step inside, the New York pop-up will be open until December 20th. There will also be a "holiday-themed" store in Milan, Italy before long.

Source: https://www.engadget.com/2017/11/25/pornhub-store-includes-livestreaming-bed-camera/


Original Submission

AI-Generated Fake Celebrity Porn Craze "Blowing Up" on Reddit 48 comments

Fake celebrity porn is blowing up on Reddit, thanks to artificial intelligence.

Back in December, the unsavory hobby of a Reddit user by the name of deepfakes became a new centerpiece of artificial intelligence debate, specifically around the newfound ability to face-swap celebrities and porn stars. Using software, deepfakes was able to take the faces of famous actresses and swap them with those of porn actresses, letting him live out a fantasy of watching famous people have sex. Now, just two months later, easy-to-use applications have sprouted up that perform this editing with even greater ease, according to Motherboard, which also first reported about deepfakes late last year.

Thanks to AI training techniques like machine learning, scores of photographs can be fed into an algorithm that creates convincing human masks to replace the faces of anyone on video, all by using lookalike data and letting the software train itself to improve over time. In this case, users are putting famous actresses into existing adult films. According to deepfakes, this required some extensive computer science know-how. But Motherboard reports that one user in the burgeoning community of pornographic celebrity face swapping has created a user-friendly app that basically anyone can use.

The same technique can be used for non-pornographic purposes, such as inserting Nicolas Cage's face into classic movies. One user also "outperformed" the Princess Leia scene at the end of Disney's Rogue One (you be the judge, original footage is at the top of the GIF).

The machines are learning.


Original Submission

Discord Takes Down "Deepfakes" Channel, Citing Policy Against "Revenge Porn" 27 comments

The messaging platform Discord has taken down a channel that was being used to share and spread AI-edited pornographic videos:

Last year, a Reddit user known as "deepfakes" used machine learning to digitally edit the faces of celebrities into pornographic videos, and a new app has made it much easier to create and spread the videos online. On Friday, chat service Discord shut down a user-created group that was spreading the videos, citing its policy against revenge porn.

Discord is a free chat platform that caters to gamers, and has a poor track record when it comes to dealing with abuse and toxic communities. After it was contacted by Business Insider, the company took down the chat group, named "deepfakes."

Discord is a Skype/TeamSpeak/Slack alternative. Here are some /r/deepfakes discussions about the Discord problem.

One take is that there is no recourse for "victims" of AI-generated porn, at least in the U.S.:

People Can Put Your Face on Porn—and the Law Can't Help You

To many vulnerable people on the internet, especially women, this looks a whole lot like the end times. "I share your sense of doom," said Mary Anne Franks, who teaches First Amendment and technology law at the University of Miami Law School and also serves as the tech and legislative policy advisor for the Cyber Civil Rights Initiative. "I think it is going to be that bad."

Merkel Trump Deepfake

Previously: AI-Generated Fake Celebrity Porn Craze "Blowing Up" on Reddit


Original Submission

My Struggle With Deepfakes 14 comments

There has been some controversy over Deepfakes, a process of substituting faces in video. Almost immediately, it was used for pornography. While celebrities were generally unamused, porn stars were alarmed by the further commodification of their rôle. The algorithm is widely available and several web sites removed objectionable examples. You know something is controversial when porn sites remove it. Reddit was central for Deepfakes/FakeApp tech support and took drastic action to remove discussion after it started to become synonymous with fictitious revenge porn and other variants of anti-social practices.

I found a good description of the deepfakes algorithm. It runs via a standard neural network library but requires considerable processing power on specific GPUs. I will describe the video input (whose face is to be removed) as the source, and the replacement face as the target. The neural network is trained on the target face only: reference images of the target are distorted, and the network learns to restore them to the undistorted originals. When the trained network is then given the source, it treats it as just another distorted input and "undistorts" it toward the target.
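
As a concrete illustration, here is a minimal sketch of that training scheme in Python with PyTorch. Everything in it (network size, loss, the crude noise-based "distortion") is an illustrative assumption rather than the actual FakeApp code, which uses proper face alignment, warping, and larger models.

    import torch
    import torch.nn as nn

    class FaceAutoencoder(nn.Module):
        """Tiny convolutional autoencoder for 64x64 RGB face crops."""
        def __init__(self):
            super().__init__()
            self.encoder = nn.Sequential(
                nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
                nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
                nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            )
            self.decoder = nn.Sequential(
                nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
                nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
                nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
            )

        def forward(self, x):
            return self.decoder(self.encoder(x))

    def distort(batch):
        """Crude stand-in for the distortion step (real code warps the face)."""
        return (batch + 0.1 * torch.randn_like(batch)).clamp(0, 1)

    model = FaceAutoencoder()
    optimizer = torch.optim.Adam(model.parameters(), lr=5e-5)
    loss_fn = nn.L1Loss()

    # target_faces: aligned crops of the face to insert, shape (N, 3, 64, 64).
    target_faces = torch.rand(256, 3, 64, 64)  # placeholder data

    for step in range(1000):
        batch = target_faces[torch.randint(0, len(target_faces), (16,))]
        reconstruction = model(distort(batch))  # train on distorted targets...
        loss = loss_fn(reconstruction, batch)   # ...to restore clean targets
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    # At conversion time a crop of the *source* face is fed in; having only ever
    # learned to produce the target, the network "undistorts" the source toward
    # the target's appearance.

The published variants share one encoder between two decoders (one per face), which keeps pose and lighting consistent across the swap; the single-decoder sketch above only shows the core "undistortion" trick.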

[Continues...]

  • (Score: 2) by takyon on Thursday February 08 2018, @02:09AM (1 child)

    by takyon (881) <takyonNO@SPAMsoylentnews.org> on Thursday February 08 2018, @02:09AM (#634631) Journal

    People are blaming this guy [reddit.com] for the subreddit(s) getting taken down. Of course, Reddit was facing some significant (?) bad press over AI celeb porn and would have likely used the banhammer eventually.

    Not only does it seem you internally sabotaged a subreddit for what was as far as I can tell an issue hugely blown out of proportion and easily solvable internally, you're now giving the tech behind DeepFakes a bad rep to uninformed people.

    I'm certainly not claiming that it wasn't already viewed in a negative light by a good chunk of people (after all it was introduced and mostly used for fake celeb porn), but tacking on supposed child porn which caused a site ban certainly isn't going to help change that. You're now suggesting that two subs (r/fakeapp and r/facesets) also get banned despite being completely SFW.

    Compare to:

    https://en.wikipedia.org/wiki/Controversial_Reddit_communities [wikipedia.org]

    In August 2014, Reddit users began sharing a large number of naked pictures of celebrities stolen, using phishing, from their private Apple iCloud accounts. A subreddit, /r/TheFappening, was created as a hub to share and discuss these stolen photos; the situation was called CelebGate by the media. The subreddit contained most of the images. Victims of "The Fappening" included high-profile names such as Jennifer Lawrence and Kate Upton. Some of the images may have constituted child pornography, as the photos of Liz Lee and McKayla Maroney from the leak were claimed to have been taken when the women were underage, though this remains controversial. The subreddit was closed by Reddit administrators in September 2014. The scandal led to wider criticisms concerning the website's moderation from The Verge and The Daily Dot.

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
    • (Score: 4, Funny) by maxwell demon on Thursday February 08 2018, @07:10AM

      by maxwell demon (1608) on Thursday February 08 2018, @07:10AM (#634760) Journal

      In August 2014, Reddit users began sharing a large number of naked pictures

      Ah, how could they share the pictures without putting a frame around them! :-)

      --
      The Tao of math: The numbers you can count are not the real numbers.
  • (Score: 4, Interesting) by Anonymous Coward on Thursday February 08 2018, @03:17AM (9 children)

    by Anonymous Coward on Thursday February 08 2018, @03:17AM (#634657)

    I'd be interested in hearing why the supporters of this move feel one twin couldn't exercise the same right they presumably claim the celebrity subjects have, in order to prevent the other twin from voluntarily sending porn of themselves to their SO.

    Either people can censor works which appear similar to their face, or they can't. The existence of others with the same face can't impact that right, or it was never a right.

    • (Score: 0) by Anonymous Coward on Thursday February 08 2018, @04:57AM (6 children)

      by Anonymous Coward on Thursday February 08 2018, @04:57AM (#634680)

      This is a rather poor argument to make. The two situations are completely different. With twins, you'd have to have an actual identical twin in order for this to work, and most celebrities don't have a twin, identical or not.

      Secondly, this violates a person's right to control their own image. At least when it comes to twins, the twin has to actually look like their twin in order for it to possibly work; with the AI stuff, you just need sufficient images of the person in order to create a replacement face for somebody else who has a similar body type. Not anywhere near as rare.

      • (Score: 5, Insightful) by janrinok on Thursday February 08 2018, @07:03AM (4 children)

        by janrinok (52) Subscriber Badge on Thursday February 08 2018, @07:03AM (#634756) Journal

        a person's right to control their own image

        I must have missed this one somewhere. Where is this 'right' enshrined?

        So Weird Al's take-off of Michael Jackson is illegal because of a 'right' that is applicable worldwide? I've checked the UN site - the most globally applicable site even if most people do not accept it as such - and I can find nothing. Or is the claimed right only being taken away because it involves 'porn'? In which case, I could make a deepfake of someone committing murder but that would be OK, is that it?

        If such a right exists, how come no famous person has taken legal action against the makers of deepfakes yet? Surely this is a money-making opportunity in the US that the rich and famous cannot afford to miss. Or perhaps they have, and I, living outside the US, haven't heard about it because the 'right' that you are claiming doesn't actually exist elsewhere in the world?

        I'm not saying that I think that deepfakes of celebrities in acts of pornography is a great idea, but I am pointing out that many people seem to believe that they have rights that don't actually exist. There are legal measures already in existence for people to address problems of defamation of character, slander or libel, but suggesting that there is an all-encompassing right to one's personal appearance enshrined in law somewhere is perhaps a bit misleading. And defamation of character might be a tricky one to prove if it transpires that the star in question did perform sexual acts in her earlier days in a bid to help her achieve the fame she sought.

        • (Score: 1, Informative) by Anonymous Coward on Thursday February 08 2018, @08:39AM (1 child)

          by Anonymous Coward on Thursday February 08 2018, @08:39AM (#634784)

          While I don't disagree with the substance of your post, it is well known that Weird Al always gets permission from the subjects of his parodies.

          • (Score: 2) by Grishnakh on Thursday February 08 2018, @04:26PM

            by Grishnakh (2831) on Thursday February 08 2018, @04:26PM (#634986)

            Not only that, but doesn't (didn't) Weird Al also sometimes get the original band members to play for him?

        • (Score: 3, Insightful) by Wootery on Thursday February 08 2018, @11:14AM (1 child)

          by Wootery (2341) on Thursday February 08 2018, @11:14AM (#634819)

          Where is this 'right' enshrined?

          In the US? Varies by state, apparently: https://en.wikipedia.org/w/index.php?title=Personality_rights&oldid=823760789#United_States [wikipedia.org]

          • (Score: 2) by janrinok on Thursday February 08 2018, @03:52PM

            by janrinok (52) Subscriber Badge on Thursday February 08 2018, @03:52PM (#634965) Journal

            This raised a couple of questions:

            If the right is not recognised world-wide but is, in fact, only applicable in the US (and even then the extent of applicability varies from state to state), then would it apply to, say, Emma Watson? She is a UK citizen. While she is in the US, she has to comply with US law, but for anyone else in the world, they could use her image because no offence under such laws is being committed. Nor could she claim protection under US law while she is outside of the US, for she is not American. US law is not applicable to non-US citizens outside of the US, a fact that seems to be often overlooked in discussions here. And, increasingly, non-US governments are baulking at US requests to have people extradited for actions that are not offences in the country in which the alleged crime took place, or would prefer to have the punishment meted out in their own courts rather than face the plea-bargaining and extreme measures that are being seen within the US.

            Following on, even if the deepfake image is made of a US personality by someone outside of the US then, again, perhaps no law has been contravened. The claimed 'right' to control one's image only exists in the US under US law and is not applicable to anyone else. While this will prevent US websites and US citizens from being able to legally produce deepfake images, it might have no effect on their production elsewhere. Any US citizen who has been the subject of a deepfake image must first ascertain who committed the perceived offence and then see if any laws have been broken in the country in which the image was made.

            Again, I stress that I do not support the creation of such images, but I don't think that the current moves on the part of Pornhub et al are going to have much of a dampening effect on their production.

      • (Score: 0) by Anonymous Coward on Thursday February 08 2018, @04:57PM

        by Anonymous Coward on Thursday February 08 2018, @04:57PM (#635015)

        Why do you feel considering non-extant situations cannot give insight? Physics students sure spend a lot of time on hypothetical situations, I'm sure they'd be glad to realize they don't need to spend all that time considering frictionless surfaces as they can never occur and so nothing useful can be gleaned from their consideration.

        Why are the situations different? You fail to argue for that point.

        this violates a person's right to control their own image

        My whole point is that such a right cannot sensibly exist. You're begging the question.

        You seem to think that a situation being rare exempts it from consideration, see the first point.

    • (Score: 2) by DeathMonkey on Thursday February 08 2018, @06:23PM (1 child)

      by DeathMonkey (1380) on Thursday February 08 2018, @06:23PM (#635088) Journal

      I'd be interested in hearing why the supporters of this move feel one twin couldn't exercise the same right...

      Is there anyone suggesting a twin shouldn't be entitled to his/her own personality rights? Sounds like a strawman to me....

      • (Score: 0) by Anonymous Coward on Friday February 09 2018, @03:46PM

        by Anonymous Coward on Friday February 09 2018, @03:46PM (#635529)

        I'm suggesting that the implications of enshrining the right to control media which appears to depict one in law are worse than those of not doing so, and implying the supporters haven't given due consideration to the outcome of their proposed solution.

        "If you treat this as a right, then you're forced into accepting this distasteful situation." (which I imply is worse than the problem you're trying to solve by granting that right)

  • (Score: 2, Interesting) by Anonymous Coward on Thursday February 08 2018, @03:19AM (11 children)

    by Anonymous Coward on Thursday February 08 2018, @03:19AM (#634658)

    Really, the best thing would be to allow this kind of porn.

    The quicker people get acclimated to the fact that deepfakes are fake, the better.

    In the far flung future, data will just be data.

    • (Score: 4, Insightful) by takyon on Thursday February 08 2018, @03:36AM (10 children)

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Thursday February 08 2018, @03:36AM (#634660) Journal

      The centralized web giants have gone from censoring illegal content to censoring content that makes people feel bad. The implications of widespread fakery don't matter to them.

      "Involuntary pornography" is voluntary. One party volunteered to get naked on camera, and another party volunteered to make their face known by publishing photos or videos of it, or walking around in public (a bigger concern with celebrities, but everyone is on camera now). Finally, some random person on the web volunteered to mash the two together.

      A lot of the appeal here seems to be the desire for celebrities who are "hard to get" (don't appear nude). But machine learning will also lead to the creation of "virtual actors" who mix and match traits from so many people that they can't be tied to any specific one. Or they can just move the faux celeb nudes off of Giphy/Reddit/Cornhub/etc. and onto decentralized platforms or overseas ones that don't care.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 1, Insightful) by Anonymous Coward on Thursday February 08 2018, @05:00AM (5 children)

        by Anonymous Coward on Thursday February 08 2018, @05:00AM (#634681)

        Sigh, really. No, it's not voluntary: somebody's face is now pasted on somebody else's body, making it appear that they were engaged in acts that they weren't engaged in. For many celebrities, their ability to earn money is based in part on the image that they've cultivated.

        For people who aren't celebrities, this is a creepy tool that can be used to create porn of somebody who didn't actually consent to be involved in porn. This is sort of like when a director uses a body double in a movie to make it appear as though the actor was naked. Common decency alone ought to be sufficient to tell people that it's not OK to create porn that appears to depict somebody who didn't consent to being in porn.

        • (Score: 0) by Anonymous Coward on Thursday February 08 2018, @06:17AM

          by Anonymous Coward on Thursday February 08 2018, @06:17AM (#634723)

          Another critically threatened faction is the porn actor. With this evolving technology, a virtual perfect ( for each individual's taste, no less! ) porn actor will perform any stunt - even stunts not physically possible given human structure.

          A real "wonder woman", actually bending large pipes, using humongous pectoralis major muscles, which grow and shrink according to the state of her energy levels... biceps the shape and size of baseballs erupt on tension, calf muscles tightly bunching up pulling their achilles tendon tight as tow ropes. Yet all melting back to voluptious womanhood when relaxed...

          No real human is capable of doing such illusions, short of Hollywood magic.. but apparently anything anyone can even imagine, every erotic fantasy, will be imaged. Morphs to bestiality... child porn of completely synthesized features... anything anyone can dream up!

          Would any existing human mate physically measure up?

          It was bad enough having advertising agencies set the bar for what people should look like. Even today, the instant I turn on the TV, there is some advertiser selling the treatment for something that he says others think is ugly. I stink. I have stubble. Bags under my eyes! A roll of fat here or there. Too fat. Too thin. Not in fashion. Not seen in the right place. No-one's gonna want me, so I'd better buy now while their stuff is still available. We'll double the offer, just pay separate fee.

          Just wait for the "joystick" and "joystick receptacle" and other lifelike sex toys/dolls to make the scene... bluetooth enabled for sync to the VR headset.

          I guess the government will have to start paying people to procreate the way it's been done since life came into existence on this planet - as the "real" thing won't be nearly as much fun as playing around with the "imaginary" thing... not only that, but you will end up with kids to raise!

          Yes, I know this whole thing looks way too much like those spam posts that have been showing up here.

        • (Score: 3, Insightful) by takyon on Thursday February 08 2018, @06:45AM (2 children)

          by takyon (881) <takyonNO@SPAMsoylentnews.org> on Thursday February 08 2018, @06:45AM (#634742) Journal

          You talk about common decency, but how about legality? Personality rights [wikipedia.org] seem to apply only to the use of someone's image for commercial purposes. That might be applicable to faceswap porn if there is ad revenue or a commission involved. Absent that, if it's just a labor of "love", then it doesn't seem to violate any laws or infringe on "personality rights" (which could be wiped out by the Supreme Court).

          Common decency alone is not sufficient to suppress activities that are legal. The person running their GTX 1080 GPU hot for 12 hours in order to swap a celeb's face onto a porn star's body has transcended the shackles of common decency. The best you can get them for is probably a copyright violation... of the porn producer's copyright, which could lead to a DMCA cat and mouse game. But it would likely go unnoticed if the porn company doesn't detect it, especially since the new work could be relabeled without the original name.

          If it isn't illegal to use someone's likeness, then the person contributes that likeness by venturing out into public and being a target of photography.

          Could it be illegal to train the AI using frames of someone's face ripped from a Hollywood movie? It could be very hard to reverse the process and prove where the imagery came from.

          --
          [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
          • (Score: 2) by DeathMonkey on Thursday February 08 2018, @06:02PM (1 child)

            by DeathMonkey (1380) on Thursday February 08 2018, @06:02PM (#635070) Journal

            This is from your own link:

            In the United States, the right of publicity is a state law-based right, as opposed to federal, and recognition of the right can vary from state to state.[30] The rationale underlying the right of publicity in the United States is rooted in both privacy and economic exploitation.[31] The rights are based in tort law, and the four causes of action are: 1) Intrusion upon physical solitude; 2) public disclosure of private facts; 3) depiction in a false light; and 4) appropriation of name and likeness.

            3 and 4 would both seem to apply.

            • (Score: 2) by takyon on Thursday February 08 2018, @06:24PM

              by takyon (881) <takyonNO@SPAMsoylentnews.org> on Thursday February 08 2018, @06:24PM (#635089) Journal

              From the source [rightofpublicity.com] at the end of that paragraph:

              The Right of Publicity is likely to experience continued evolution in 2018, primarily from Right of Publicity legislation and case law (often involving fascinating fact patterns). One could also observe that the U.S. now has a President who entered office with more Right of Publicity licensing than any prior elected official. In short, it is evident that the Right of Publicity will continue generating vigorous debate. The Right of Publicity is often misunderstood, in part because recognition of the Right of Publicity varies significantly from state to state and country to country.

              The laws vary by state. What if someone who doesn't live in California or Indiana creates the porn? The laws also describe civil matters, not criminal. The federal Lanham Act deals with commercial appropriation of someone's likeness. The laws could be made obsolete by the Supreme Court in the coming years.

              I don't see a lot of recourse for the "victims". Suing users successfully will be difficult. And they may trigger a Streisand effect by trying to get their faked porn taken down.

              People Can Put Your Face on Porn—and the Law Can't Help You [wired.com]

              Franks helped write much of the US’s existing legislation that criminalizes nonconsensual porn—and it's not going to help. It’s not that Franks and lawmakers weren’t thinking about the implications of manipulated images. It’s that the premise of any current legislation is that nonconsensual porn is a privacy violation. Face-swap porn may be deeply, personally humiliating for the people whose likeness is used, but it's technically not a privacy issue. That's because, unlike a nude photo filched from the cloud, this kind of material is bogus. You can’t sue someone for exposing the intimate details of your life when it’s not your life they’re exposing.

              And it's the very artifice involved in these videos that provides enormous legal cover for their creators. “It falls through the cracks because it’s all very betwixt and between,” says Danielle Citron, a law professor at the University of Maryland and the author of Hate Crimes in Cyberspace. “There are all sorts of First Amendment problems because it’s not their real body.” Since US privacy laws don’t apply, taking these videos down could be considered censorship—after all, this is “art” that redditors have crafted, even if it’s unseemly.

              [...] Does that mean that victims have zero hope of legal recourse? Not exactly. Celebrities will be able to sue for the misappropriation of their images. But that usually applies to commercial contexts—like, say, if someone took a social media photo of Gal Gadot’s and then used it to promote a strip club without her consent—and commercial speech doesn’t have nearly the protection individual citizens’ does.

              For the average citizen, your best hope is anti-defamation law. When Franks realized that revenge porn law wouldn't include language about false images, she recommended that lawmakers update their anti-defamation statutes to handle it—but in many cases, that hasn't happened yet. And Franks thinks claimants will have difficulty proving that the creators intended to cause them emotional distress. So far, these videos do seem to have been created for the pleasure of the creator rather than the humiliation of the object of their desire. "Inevitably, someone will point out how many young men had posters of Princess Leia in their bedrooms as a masturbation fantasy," Franks says. "Is the harm just that you found out about it? Legally, we need to be able to articulate what is the harm, not just that it makes us feel icky." And in such a fringe case as AI-enabled porn, that hasn't happened yet.

              --
              [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
        • (Score: 0) by Anonymous Coward on Thursday February 08 2018, @07:08PM

          by Anonymous Coward on Thursday February 08 2018, @07:08PM (#635128)

          lmao! well stealing people's face is at least assault with a deadly weapon so you know...

      • (Score: 0) by Anonymous Coward on Thursday February 08 2018, @09:32AM (3 children)

        by Anonymous Coward on Thursday February 08 2018, @09:32AM (#634794)

        or walking around in public (a bigger concern with celebrities, but everyone is on camera now).

        Yeah, and all they have to do to avoid that is to become a shut-in or live as a hermit away from society. Three cheers for our mass surveillance society! This is such a good thing.

        • (Score: 2) by takyon on Thursday February 08 2018, @01:48PM (2 children)

          by takyon (881) <takyonNO@SPAMsoylentnews.org> on Thursday February 08 2018, @01:48PM (#634911) Journal

          That's about where we're at.

          Another option is to quietly accept it. Many Americans have.

          --
          [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
          • (Score: 0) by Anonymous Coward on Thursday February 08 2018, @06:31PM (1 child)

            by Anonymous Coward on Thursday February 08 2018, @06:31PM (#635098)

            There is no choice but to accept it. However, the options you presented are only the means by which one deals with it.

            Shutting oneself in as a hermit is clearly an indicator, if not a confirmation, of not wishing to be surveilled. It says nothing about acting to stop the surveilling.

            • (Score: 2) by takyon on Thursday February 08 2018, @06:46PM

              by takyon (881) <takyonNO@SPAMsoylentnews.org> on Thursday February 08 2018, @06:46PM (#635111) Journal

              To be more specific, I meant that another option is to go about your business as usual with the surveillance state intact, rather than becoming a hermit or shut-in, i.e., do nothing to try to avoid it.

              --
              [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 4, Insightful) by cubancigar11 on Thursday February 08 2018, @04:19AM (4 children)

    by cubancigar11 (330) on Thursday February 08 2018, @04:19AM (#634667) Homepage Journal

    https://www.youtube.com/watch?v=IrrADTN-dvg [youtube.com]

    Has anyone else noticed how the leftist demands are actually puritanically Victorian?

    • (Score: 0) by Anonymous Coward on Thursday February 08 2018, @04:23AM

      by Anonymous Coward on Thursday February 08 2018, @04:23AM (#634668)

      They know best. Submit to Allah Dear Leader.

      Hey! Don't forget to mark this "Troll".

    • (Score: 0) by Anonymous Coward on Thursday February 08 2018, @04:49AM

      by Anonymous Coward on Thursday February 08 2018, @04:49AM (#634676)

      Where, where but here have Pride and Truth,
      That long to give themselves for wage,
      To shake their wicked sides at youth
      Restraining reckless middle-age?

    • (Score: 2) by crafoo on Thursday February 08 2018, @05:52PM (1 child)

      by crafoo (6639) on Thursday February 08 2018, @05:52PM (#635064)

      I've started using "Regressive Authoritarians" to refer to these people. Not really liberals or leftists, although they may also label themselves as such.

      • (Score: 0) by Anonymous Coward on Thursday February 08 2018, @09:09PM

        by Anonymous Coward on Thursday February 08 2018, @09:09PM (#635201)

        Then again, what the USA considers left and what the rest of the world considers left are two very different things...

  • (Score: 4, Interesting) by Anonymous Coward on Thursday February 08 2018, @04:46AM (17 children)

    by Anonymous Coward on Thursday February 08 2018, @04:46AM (#634675)

    Assume it's wrong to create or distribute an 'image of a person' without their consent.

    What if an RNG spits out an image file which happens to look like a photo of a person; is it an image of them?
    What if a photo is taken of a person; is it an image of them?
    Given the image is the same in both cases, if your answers to the above differ then you do not believe that images can intrinsically be of people.

    One must therefore believe either a) both are images of the lookalike; b) neither are images of the lookalike; or c) images can't intrinsically be of people.
    A is absurd because it makes running an RNG morally wrong unless one first blacklists the numbers corresponding to images that look sufficiently like anyone whose consent to create images one lacks.
    B is absurd given it claims a photographic portrait isn't an image of the subject.
    C will therefore be the only option further considered.

    Both images are uploaded, and since they are identical only one copy is stored, with two softlinks to it: /rng and /photo.
    By our assumption it is moral to host /photo, and immoral to host /rng. So far, so good.
    Is it moral to host the underlying raw sectors which both links point to? Does the answer depend on the nature of the deduplication scheme rather than the content of those sectors?
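
    To make the storage scenario concrete, here is a toy content-addressed store in Python; the blob/link split mirrors the "raw sectors vs. softlinks" distinction above, and all names are illustrative.

        import hashlib

        blobs = {}   # digest -> raw bytes (the underlying "sectors")
        links = {}   # path -> digest (the "softlinks")

        def store(path: str, data: bytes) -> None:
            digest = hashlib.sha256(data).hexdigest()
            blobs.setdefault(digest, data)   # identical data is stored only once
            links[path] = digest

        image = b"...identical image bytes..."
        store("/photo", image)   # uploaded as a photograph
        store("/rng", image)     # uploaded as alleged RNG output

        assert links["/photo"] == links["/rng"]  # both names resolve to one blob
        print(len(blobs))                        # -> 1: one blob, two moral statuses?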

    • (Score: 0) by Anonymous Coward on Thursday February 08 2018, @04:57AM

      by Anonymous Coward on Thursday February 08 2018, @04:57AM (#634679)

      "By our assumption it is moral to..." mixed up /photo and /rng, it ought state hosting /rng is moral and /photo isn't.

      I also failed to account for the case where one considers the RNG output an image of the person, but not the photo of them.

      It's 5am though, and I don't really care since there's no putting this cat back in the bag.

    • (Score: 0) by Anonymous Coward on Thursday February 08 2018, @05:25AM (5 children)

      by Anonymous Coward on Thursday February 08 2018, @05:25AM (#634694)

      Unfortunately for your argument, the law and ethics care about provenance (where the data came from). Copying something is treated differently by the law than creating something that happens to be identical to that copy.

      • (Score: 2) by MostCynical on Thursday February 08 2018, @07:22AM (3 children)

        by MostCynical (2589) on Thursday February 08 2018, @07:22AM (#634766) Journal

        Star Trek replicators and transporter beams are unethical and illegal under US copyright.
        They only produce copies!

        (Also, you murdered the original person, so you broke another law)

        One extreme: US puritanism; at the other: France and Italy (the powerful politician *doesn't* have a mistress? What is wrong with him?!)

        --
        "I guess once you start doubting, there's no end to it." -Batou, Ghost in the Shell: Stand Alone Complex
        • (Score: 2) by maxwell demon on Thursday February 08 2018, @07:27AM (2 children)

          by maxwell demon (1608) on Thursday February 08 2018, @07:27AM (#634770) Journal

          Star Trek replicators and transporter beams are unethical and illegal under US copyright.
          They only produce copies!

          Star Trek replicators can reproduce stuff that is not under copyright, and in that case won't violate copyright. Of course they could also be used to violate copyright.

          --
          The Tao of math: The numbers you can count are not the real numbers.
          • (Score: 0) by Anonymous Coward on Thursday February 08 2018, @07:47AM (1 child)

            by Anonymous Coward on Thursday February 08 2018, @07:47AM (#634775)

            Star Trek replicators can reproduce stuff that is not under copyright, and in that case won't violate copyright. Of course they could also be used to violate copyright.

            You know, like the printing press. Or a modern printer.

            • (Score: 2) by maxwell demon on Thursday February 08 2018, @08:03AM

              by maxwell demon (1608) on Thursday February 08 2018, @08:03AM (#634778) Journal

              Which both are not illegal under US copyright. Thank you for supporting my point.

              --
              The Tao of math: The numbers you can count are not the real numbers.
      • (Score: 0) by Anonymous Coward on Thursday February 08 2018, @04:43PM

        by Anonymous Coward on Thursday February 08 2018, @04:43PM (#634999)

        Is it moral to host the underlying raw sectors which both links point to? Does the answer depend on the nature of the deduplication scheme rather than the content of those sectors?

        The implication being it's absurd to consider the source of data as significant.

    • (Score: 2) by maxwell demon on Thursday February 08 2018, @07:24AM (2 children)

      by maxwell demon (1608) on Thursday February 08 2018, @07:24AM (#634768) Journal

      What if an RNG spits out an image file which happens to look like a photo of a person; is it an image of them?

      Unless the RNG was specifically biased using that face, no, it isn't. However, it is extremely unlikely that this happens, so you'll have a hard time arguing that way unless you can give convincing evidence that neither an image of that person nor your knowledge of that person's appearance entered into the creation.

      What if a photo is taken of a person; it is an image of them?

      Yes, of course.

      How the thing came into being indeed does matter. It's just like in copyright: If you write something and it happens by chance to match something that someone else wrote, then you are not violating that other person's copyright. However you better have convincing evidence of that. Clean room implementations are exactly about documenting that the implementer did not have access to the copyrighted stuff.

      --
      The Tao of math: The numbers you can count are not the real numbers.
      • (Score: 0) by Anonymous Coward on Thursday February 08 2018, @04:46PM

        by Anonymous Coward on Thursday February 08 2018, @04:46PM (#635002)

        Is it moral to host the underlying raw sectors which both links point to? Does the answer depend on the nature of the deduplication scheme rather than the content of those sectors?

      • (Score: 2) by tangomargarine on Thursday February 08 2018, @06:30PM

        by tangomargarine (667) on Thursday February 08 2018, @06:30PM (#635096)

        Unless the RNG was specifically biased using that face, no it isn't. However it is extremely unlikely that this happens, and therefore you'll have a hard time arguing that way. Unless you can give convincing evidence that an image of that person, or your knowledge about that person's image, did not enter the creation.

        You could seed/bias/whatever the RNG by giving it a bunch of pictures of people who *did* consent to it, which would make it much more likely to spit out similar human-shaped images.

        --
        "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
    • (Score: 2) by Non Sequor on Thursday February 08 2018, @02:43PM (1 child)

      by Non Sequor (1005) on Thursday February 08 2018, @02:43PM (#634931) Journal

      Algorithmic information theory says that even though the probability of any specific string is the same as any other's, the probability of generating a recognizable picture with an RNG is negligible, while the probability of unrecognizable noise is almost certain. At reasonable resolutions, "negligible" means it's completely impractical to generate an image using a uniform distribution. You could instead use a reference data set to build a prior distribution over images that is more specific than the uniform one. That's essentially what a deepfake is.
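
      A back-of-envelope calculation in Python makes the point; the "recognizable" count is a deliberately generous made-up figure.

          from math import log10

          bits = 64 * 64                    # even a tiny 64x64 1-bit image...
          log10_total = bits * log10(2)     # ...has ~10^1233 possible bitmaps
          log10_recognizable = 100          # wildly generous guess at "faces"
          # chance of a uniform draw landing on a recognizable image:
          print(f"log10 P = {log10_recognizable - log10_total:.0f}")  # about -1133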

      --
      Write your congressman. Tell him he sucks.
      • (Score: 0) by Anonymous Coward on Thursday February 08 2018, @04:49PM

        by Anonymous Coward on Thursday February 08 2018, @04:49PM (#635007)

        Sorry, your comment relies on considering what happens if people generate and save large volumes of random bits and attempt to interpret them as images. Given this isn't going to realistically happen I consider any argument based on it unworthy of consideration.

        /sarcasm

    • (Score: 0) by Anonymous Coward on Thursday February 08 2018, @04:29PM

      by Anonymous Coward on Thursday February 08 2018, @04:29PM (#634989)

      All possible child porn exists in the RNG.

    • (Score: 2) by DeathMonkey on Thursday February 08 2018, @06:18PM (1 child)

      by DeathMonkey (1380) on Thursday February 08 2018, @06:18PM (#635084) Journal

      What if an RNG spits out an image file which happens to look like a photo of a person; is it an image of them?

      The justice system concerns itself with what did happen, not what might happen.

      So, I'd suggest you set up your RNG. Wait a million years. Then, enjoy your day in court!

      • (Score: 0) by Anonymous Coward on Friday February 09 2018, @03:50PM

        by Anonymous Coward on Friday February 09 2018, @03:50PM (#635532)

        I'm talking about what the law ought be, not what it is.

        The legislature exclusively concerns itself with what might happen and not at all (in a sane country) with what did happen.

    • (Score: 0) by Anonymous Coward on Thursday February 08 2018, @09:20PM (1 child)

      by Anonymous Coward on Thursday February 08 2018, @09:20PM (#635209)

      Now we are heading into colour-of-bits territory.

      http://ansuz.sooke.bc.ca/entry/23 [sooke.bc.ca]

      • (Score: 0) by Anonymous Coward on Saturday February 10 2018, @07:20AM

        by Anonymous Coward on Saturday February 10 2018, @07:20AM (#635917)

        That was a really interesting link.

  • (Score: 0) by Anonymous Coward on Thursday February 08 2018, @12:01PM (2 children)

    by Anonymous Coward on Thursday February 08 2018, @12:01PM (#634831)

    Soooo.... Any porn images of Allah?
