
posted by chromas on Friday July 19 2019, @06:02AM   Printer-friendly
from the You-should-be-in-pictures!-Oh...wait. dept.

Viral App FaceApp Now Owns Access to more than 150 Million People's Faces and Names:

Everyone's seen them: friends posting pictures of themselves now, and years in the future.

Viral app FaceApp has been giving people the power to change their facial expressions, looks, and now age for several years. But at the same time, people have been giving FaceApp the power to use their pictures — and names — for any purpose it wishes, for as long as it desires.

[...] While according to FaceApp's terms of service people still own their own "user content" (read: face), the company owns a never-ending and irrevocable royalty-free license to do anything they want with it ... in front of whoever they wish:

You grant FaceApp a perpetual, irrevocable, nonexclusive, royalty-free, worldwide, fully-paid, transferable sub-licensable license to use, reproduce, modify, adapt, publish, translate, create derivative works from, distribute, publicly perform and display your User Content and any name, username or likeness provided in connection with your User Content in all media formats and channels now known or later developed, without compensation to you. When you post or otherwise share User Content on or through our Services, you understand that your User Content and any associated information (such as your [username], location or profile photo) will be visible to the public.

FaceApp terms of use

[...] And it's a good reason to be wary when any app wants access and a license to your digital content and/or identity.

As former Rackspace manager Rob La Gesse mentioned today:

To make FaceApp actually work, you have to give it permissions to access your photos - ALL of them. But it also gains access to Siri and Search .... Oh, and it has access to refreshing in the background - so even when you are not using it, it is using you.

Do recall we recently had a story here about "Deep Fakes": Deep Fakes Advance to Only Needing a Single Two Dimensional Photograph.

Also at Security Week.


Original Submission

Related Stories

Deep Fakes Advance to Only Needing a Single Two Dimensional Photograph 4 comments

Currently, getting a realistic deep fake requires shots from multiple angles. Russian researchers have now taken this a step further, generating realistic video sequences based on a single photo.

Researchers trained the algorithm to understand facial features' general shapes and how they behave relative to each other, and then to apply that information to still images. The result was a realistic video sequence of new facial expressions from a single frame.

As a demonstration, they provide details and synthesized video sequences of historical figures such as Albert Einstein and Salvador Dali, as well as sequences based on paintings such as the Mona Lisa.

The authors are aware of the potential downsides of their technology and address this:

We realize that our technology can have a negative use for the so-called "deepfake" videos. However, it is important to realize, that Hollywood has been making fake videos (aka "special effects") for a century, and deep networks with similar capabilities have been available for the past several years (see links in the paper). Our work (and quite a few parallel works) will lead to the democratization of the certain special effects technologies. And the democratization of the technologies has always had negative effects. Democratizing sound editing tools lead to the rise of pranksters and fake audios, democratizing video recording lead to the appearance of footage taken without consent. In each of the past cases, the net effect of democratization on the World has been positive, and mechanisms for stemming the negative effects have been developed. We believe that the case of neural avatar technology will be no different. Our belief is supported by the ongoing development of tools for fake video detection and face spoof detection alongside with the ongoing shift for privacy and data security in major IT companies.

While it works with as few as one frame to learn from, the technology benefits in accuracy and 'identity preservation' from having multiple frames available. This becomes obvious when observing the synthesized Mona Lisa sequences, which, while accurate to the original, appear to be essentially three different individuals to the human eye watching them.

Journal Reference: https://arxiv.org/abs/1905.08233v1

Related Coverage
Most Deepfake Videos Have One Glaring Flaw: A Lack of Blinking
My Struggle With Deepfakes
Discord Takes Down "Deepfakes" Channel, Citing Policy Against "Revenge Porn"
AI-Generated Fake Celebrity Porn Craze "Blowing Up" on Reddit
As Fake Videos Become More Realistic, Seeing Shouldn't Always be Believing


Original Submission

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 2, Funny) by Anonymous Coward on Friday July 19 2019, @06:16AM (#868854)

    *Kicks smartphone user into their own Facebook app*

  • (Score: 1, Interesting) by Anonymous Coward on Friday July 19 2019, @06:30AM (#868856) (6 children)

    Is all this outrage passed through the first world media because faceapp was developed by a Russian company?

    • (Score: 0) by Anonymous Coward on Friday July 19 2019, @06:43AM (#868857)

      So, it just proves 150 million people need some extra braincells, or to actually engage the few they have in their skulls.

    • (Score: 2) by gtomorrow (2230) on Friday July 19 2019, @07:47AM (#868877) (3 children)

      I would have thought it was because Company X now has a database of circa 150 MILLION FACES, regardless of the flag-waving. But what do I know?

      • (Score: 0) by Anonymous Coward on Friday July 19 2019, @09:14AM (#868891)

        You miss the point. The "outrage" in the western media does seem disproportionate.

        For example compare the number of articles for this:

        https://www.theguardian.com/world/2014/feb/27/gchq-nsa-webcam-images-internet-yahoo [theguardian.com]

        vs just the number of articles playing up the russia faceapp privacy issues.

        Yes "only" more than 1.8 million yahoo accounts were affected. But "demographics" of images you get from such webcam sessions are quite different from the images from faceapp. e.g. there'll be fewer sexually explicit or high blackmail potential images going to faceapp.

        I've already told friends that their photos go to Russia etc and so far _all_ of them went "So what?". But I'm pretty sure fewer of them would go "so what" if their webcam/messaging videos were being leaked.

      • (Score: 0) by Anonymous Coward on Friday July 19 2019, @03:29PM (#869009) (1 child)

        150 MILLION FACES

        Facefuck probably had more than that and the MSM was fine with it until they "took $50k of Russian ad money and caused Trump to get elected".

        • (Score: 0) by Anonymous Coward on Saturday July 20 2019, @12:08AM (#869231)

          Almost no one I know voted FOR Trump. They were trying to vote against Hillary!

    • (Score: 3, Interesting) by Bot (3902) on Friday July 19 2019, @09:02AM (#868888) Journal

      How is my nginx doing, BTW? And the made-in-China router that can potentially MITM a lot of your traffic to your made-in-China phone, tablet, or PC?

      --
      Account abandoned.
  • (Score: 3, Funny) by aristarchus (2645) on Friday July 19 2019, @07:16AM (#868866) Journal

    Quick gargle search for "deepfake aristarchus" produced this result:
    The Horror! [i.gzn.jp]

  • (Score: 5, Informative) by ledow (5567) on Friday July 19 2019, @07:47AM (#868876) Homepage (1 child)

    I'm afraid I don't.

    I loaded the Android app from the Google Play Store.

    I gave no explicit consent. I didn't have to sign up for any account or agree to any terms and conditions.

    I'm inside the EU.

    Thus if my personal data (image or name that I typed in) appear *anywhere* or are used for *anything*, then that's a GDPR violation.

    The only "permission" I gave was to access the camera. That's it. That's not an explicit right to do anything with images recorded from it... far from it.

    I'd love to see the court case where they tried to argue that they can. The EU/UK courts would tear them apart.

    • (Score: 1, Interesting) by Anonymous Coward on Friday July 19 2019, @09:25AM (#868895)

      Doesn't the GDPR stuff only take effect if those involved ever stepped foot in the EU or tried to do business in the EU?

      Would an Interpol Red Notice be issued for such stuff or would the EU do extraordinary renditions for GDPR and similar violations?

      If these are unlikely then they can use your photos as they see fit as long as they are not breaking the laws of the countries they are in or want/need to visit.
  • (Score: 5, Interesting) by gtomorrow (2230) on Friday July 19 2019, @07:57AM (#868879)

    Every time I think to myself, "the future wasn't supposed to be like this," I read articles like this, do a facepalm, add the latest breach of privacy/trust to the neverending list and say "ah, yeah...now I remember why it is like this."

    "Jane, stop this crazy thing!"

  • (Score: 1) by jmichaelhudsondotnet (8122) on Friday July 19 2019, @08:57PM (#869163) Journal

    You should at least get a cut from the re-use of your own face.

    It's the whole giving your birthright away for a bowl of porridge thing.

    Clearly another dark pattern, exploiting vanity and the sheen of new tech to lure people into a trap.

    Can their face and likeness be used in an advertisement that they are going to see over and over for the next few years everywhere they go in public?

    Can it be sold to their greatest enemy for a 19.99 one month subscription to a service like cell phone records?

    The sci fi dystopian possibilities here are pretty wild. This is a mania, people will get hurt, corporations will win.

    Like Mr. Stallman says, it is difficult to defend people's rights when they are so hellbent on giving them away.

    Reminds me also of Pinocchio. And Requiem for a Dream.
