posted by janrinok on Wednesday April 22 2020, @01:53AM   Printer-friendly
from the now-get-the-voice-right dept.

This Open-Source Program Deepfakes You During Zoom Meetings, in Real Time:

Video conferencing apps like Zoom and Skype are usually boring and often frustrating. With more people than ever using this software to work from home, users are finding new ways to spice up endless remote meetings and group hangs: looping videos of themselves looking engaged, adding wacky backgrounds, and now applying deepfake filters to impersonate celebrities when they're tired of their own face staring back from the front-facing camera window.

Avatarify is a program that superimposes someone else's face onto yours in real time during video meetings. The code is available on GitHub for anyone to use.

Programmer Ali Aliev used the open-source code from the "First Order Motion Model for Image Animation," published on the arXiv preprint server earlier this year, to build Avatarify. First Order Motion, developed by researchers at the University of Trento in Italy as well as Snap, Inc., drives a photo of a person using a video of another person—such as footage of an actor—without any prior training on the target image.

With other face-swap technologies, like deepfakes, the algorithm is trained on the specific face you want to swap, usually requiring many images of the person you're trying to animate. First Order Motion instead trains once on a broad category of images (like faces) and can then animate any target image in real time, with no per-person training.
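The two-stage idea above—extract motion keypoints from each webcam frame, then warp a single source image to follow them—can be sketched as a loop. This is a minimal illustration with stubs standing in for the trained networks; every function and field name here is an assumption for illustration, not Avatarify's actual API:

```python
# Sketch of a real-time "avatar driving" loop in the style of the
# First Order Motion Model: one source image (the avatar) is animated
# frame by frame using motion extracted from live webcam frames.
# The detectors below are stubs; the real system runs trained CNNs.

from dataclasses import dataclass


@dataclass
class Keypoints:
    points: list  # (x, y) landmark positions, normalized to [0, 1]


def extract_keypoints(frame):
    """Stub keypoint detector: real code would run a CNN over the frame."""
    return Keypoints(points=[(0.5, 0.5), (0.3, 0.4), (0.7, 0.4)])


def warp_source(source_image, source_kp, driving_kp):
    """Stub motion transfer: the real model warps the source image so its
    keypoints move the way the driving keypoints moved."""
    motion = [(dx - sx, dy - sy)
              for (sx, sy), (dx, dy) in zip(source_kp.points, driving_kp.points)]
    return {"base": source_image, "motion": motion}


def animate(source_image, webcam_frames):
    """Drive the source image with each webcam frame; note there is no
    per-identity training step, only per-frame keypoint extraction."""
    source_kp = extract_keypoints(source_image)
    for frame in webcam_frames:
        driving_kp = extract_keypoints(frame)
        yield warp_source(source_image, source_kp, driving_kp)
```

The key property the article describes shows up in `animate`: the source image is fixed and never trained on, so swapping in a different celebrity photo requires no retraining.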

"I ran [the First Order Model] on my PC and was surprised by the result. What's important, it worked fast enough to drive an avatar real-time," Aliev told Motherboard. "Developing a prototype was a matter of a couple of hours and I decided to make fun of my colleagues with whom I have a Zoom call each Monday. And that worked. As they are all engineers and researchers, the first reaction was curiosity and we soon began testing the prototype."


Original Submission

  • (Score: 3, Interesting) by DannyB on Wednesday April 22 2020, @02:50PM


    You beat me to it. That is exactly right. I don't think they are going to deep fake a digital signature.

    People are the weak link. That is what deep fakes and scams would target.

    The target will then become the boss who authorized the action that defrauds their own business. Is it possible for the bosses to become smart enough to recognize a ruse?

    The bosses could rely on employees to recognize the ruse, requiring a digital signature from the employee to sign off that the request appears to be genuine and not some clever fake.
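    That sign-off step could be as simple as the employee attaching a keyed MAC over the request, which the back office checks before acting. A sketch only, using Python's standard hmac module; the request format and key handling are illustrative assumptions, not anything the poster specifies:

    ```python
    import hashlib
    import hmac


    def sign_request(employee_key: bytes, request: bytes) -> str:
        """Employee attests the request looks genuine by MACing it with their key."""
        return hmac.new(employee_key, request, hashlib.sha256).hexdigest()


    def verify_request(employee_key: bytes, request: bytes, tag: str) -> bool:
        """Back office verifies the attestation before acting on the request."""
        expected = sign_request(employee_key, request)
        # Constant-time comparison avoids leaking the tag via timing.
        return hmac.compare_digest(expected, tag)
    ```

    A real deployment would use per-employee asymmetric keys rather than a shared secret, so an approval can't be forged by anyone who obtains the key—which is exactly the human weak link the comment is worried about.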

    Then employees who try to recognize these fakes will become the social engineering targets.

    So they will need an authorization from their boss.

    But then . . .

    --
    Poverty exists not because we cannot feed the poor, but because we cannot satisfy the rich.