posted by janrinok on Wednesday April 22 2020, @01:53AM
from the now-get-the-voice-right dept.

This Open-Source Program Deepfakes You During Zoom Meetings, in Real Time:

Video conferencing apps like Zoom and Skype are usually boring and often frustrating. With more people than ever using this software to work from home, users are finding new ways to spice up endless remote meetings and group hangs by looping videos of themselves looking engaged, adding wacky backgrounds, and now, using deepfake filters to impersonate celebrities when they're tired of their own faces staring back at them from the front-facing camera window.

Avatarify is a program that superimposes someone else's face onto yours in real time during video meetings. The code is available on GitHub for anyone to use.
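One way a tool like this can get its output into Zoom or Skype is to register a virtual camera and have the meeting app select that instead of the physical webcam. Here is a rough sketch of that plumbing, assuming the OpenCV and pyvirtualcam packages, a local "avatar.jpg", and a placeholder animate_avatar() standing in for the real face-animation model:

    import cv2
    import pyvirtualcam

    def animate_avatar(avatar_rgb, webcam_rgb):
        # Placeholder: a real implementation would drive avatar_rgb with the
        # pose and expression found in webcam_rgb. Here we just pass the
        # webcam frame through so the plumbing runs end to end.
        return webcam_rgb

    avatar = cv2.cvtColor(cv2.imread("avatar.jpg"), cv2.COLOR_BGR2RGB)  # still photo to animate
    cap = cv2.VideoCapture(0)  # the physical webcam

    with pyvirtualcam.Camera(width=640, height=480, fps=30) as cam:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            frame_rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
            out = animate_avatar(avatar, frame_rgb)   # drive the avatar with your face
            out = cv2.resize(out, (640, 480))
            cam.send(out)                             # the meeting app sees this as a webcam
            cam.sleep_until_next_frame()

In the meeting app, you would then pick the virtual camera as the video source instead of the real one.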

Programmer Ali Aliev used the open-source code from the "First Order Motion Model for Image Animation," published on the arXiv preprint server earlier this year, to build Avatarify. First Order Motion, developed by researchers at the University of Trento in Italy as well as Snap, Inc., drives a photo of a person using a video of another person—such as footage of an actor—without any prior training on the target image.

With other face-swap technologies, like deepfakes, the algorithm is trained on the face you want to swap, usually requiring several images of the person's face you're trying to animate. This model can do it in real time because the algorithm is trained on a broad category matching the target (such as faces), not on the specific person.
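In rough terms, the model finds facial keypoints in the still photo and in each frame of the driving video, shifts the photo's keypoints by however much the driving face has moved since its starting pose, and renders a warped frame from that. Because the keypoint detector is trained on faces in general, nothing has to be learned about the particular photo. A toy sketch of just the motion-transfer arithmetic, with hand-made keypoints standing in for the learned ones:

    import numpy as np

    # Toy numbers only: in the real model these keypoints are predicted by a
    # network trained on many faces, never on this particular person.
    source_kp   = np.array([[0.30, 0.40], [0.70, 0.40], [0.50, 0.70]])  # still photo
    driver_kp_0 = np.array([[0.32, 0.42], [0.68, 0.42], [0.50, 0.72]])  # driving video, frame 0
    driver_kp_t = np.array([[0.35, 0.40], [0.71, 0.44], [0.53, 0.71]])  # driving video, frame t

    # Apply the driver's motion since its first frame to the photo's keypoints.
    # Identity stays with the photo; only the motion is borrowed from the driver.
    moved_kp = source_kp + (driver_kp_t - driver_kp_0)
    print(moved_kp)  # positions a generator network would warp the photo toward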

"I ran [the First Order Model] on my PC and was surprised by the result. What's important, it worked fast enough to drive an avatar real-time," Aliev told Motherboard. "Developing a prototype was a matter of a couple of hours and I decided to make fun of my colleagues with whom I have a Zoom call each Monday. And that worked. As they are all engineers and researchers, the first reaction was curiosity and we soon began testing the prototype."


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 4, Interesting) by bradley13 on Wednesday April 22 2020, @10:53AM (2 children)

    by bradley13 (3053) on Wednesday April 22 2020, @10:53AM (#985707) Homepage Journal

    In the course of Corona, I've gotten a lot more spam than usual. At least one piece of it was incredibly well faked - it genuinely looked like it was from my employer, right up until I did the "view source", because the subject seemed...off. I don't want to know how many non-IT-savvy employees just clicked on the link it contained. Phishing attacks are getting very sophisticated.

    So, think of those scams where an email, spoofed to be from the boss, tells the accounting dept. to transfer $millions to Outer Elbonia. Now send a videoconference link and deep-fake the boss. How much more believable will that be?

    Real-time deep-fakes will bring some really challenging security issues...

    --
    Everyone is somebody else's weirdo.
  • (Score: 4, Insightful) by crafoo on Wednesday April 22 2020, @11:22AM (1 child)

    by crafoo (6639) on Wednesday April 22 2020, @11:22AM (#985711)

    People might actually start using digital signatures if this continues.

    • (Score: 3, Interesting) by DannyB on Wednesday April 22 2020, @02:50PM

      by DannyB (5839) Subscriber Badge on Wednesday April 22 2020, @02:50PM (#985737) Journal

      You beat me to it. That is exactly right. I don't think they are going to deep fake a digital signature.

      People are the weak link. That is what deep fakes and scams would target.

      The target will then become the boss who authorized the action that defrauds their own business. Is it possible for the bosses to become smart enough to recognize a ruse?

      The bosses could rely on employees to recognize the ruse, and use a digital signature for the employee to sign off that this request appears to be genuine and not some clever fake.

      Then employees who try to recognize these fakes will become the social engineering targets.

      So they will need an authorization from their boss.

      But then . . .
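      For illustration, the sign-off step itself is only a few lines with an off-the-shelf signing library. A minimal sketch using Python's cryptography package and Ed25519 keys (the library choice and the message wording are just example assumptions):

          from cryptography.exceptions import InvalidSignature
          from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

          # The employee generates a keypair once; the public key is shared out of band.
          employee_key = Ed25519PrivateKey.generate()
          employee_pub = employee_key.public_key()

          # The employee attests that a specific request looked genuine.
          attestation = b"Wire request #4711 to Acme GmbH verified by phone on 2020-04-22"
          signature = employee_key.sign(attestation)

          # Anyone holding the public key can check the attestation later.
          try:
              employee_pub.verify(signature, attestation)
              print("sign-off is authentic")
          except InvalidSignature:
              print("sign-off was forged or altered")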

      --
      To transfer files: right-click on file, pick Copy. Unplug mouse, plug mouse into other computer. Right-click, paste.