This Open-Source Program Deepfakes You During Zoom Meetings, in Real Time:
Video conferencing apps like Zoom and Skype are usually boring and often frustrating. With more people than ever using this software to work from home, users are finding new ways to spice up endless remote meetings and group hangs: looping videos of themselves looking engaged, adding wacky backgrounds, and now using deepfake filters to impersonate celebrities when they're tired of their own face staring back at them from the front-facing camera window.
Avatarify is a program that superimposes someone else's face onto yours in real time during video meetings. The code is available on GitHub for anyone to use.
Programmer Ali Aliev used the open-source code from the "First Order Motion Model for Image Animation," published on the arXiv preprint server earlier this year, to build Avatarify. First Order Motion, developed by researchers at the University of Trento in Italy as well as Snap, Inc., drives a photo of a person using a video of another person—such as footage of an actor—without any prior training on the target image.
With other face-swap technologies, like conventional deepfakes, the algorithm is trained on the specific face you want to swap, usually requiring many images of the person you're trying to animate. First Order Motion can instead do the swap in real time, because the model is trained in advance on a broad category of similar images (like faces) rather than on the particular target.
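To make that pipeline concrete, here is a minimal, hypothetical sketch of the capture-and-animate loop: a still photo serves as the source, the webcam provides the driving video, and a pretrained motion-transfer model (stubbed out below) would warp the photo to match each incoming frame. The function and file names are illustrative and are not Avatarify's actual API.

```python
import cv2  # pip install opencv-python


def animate_frame(source_image, driving_frame):
    """Stand-in for the motion-transfer step: the real model extracts keypoints
    from the driving frame and warps the source image so the avatar mimics the
    driver's expression and head pose. Here we simply echo the driving frame."""
    return driving_frame


def run_avatar_loop(avatar_path: str) -> None:
    source = cv2.imread(avatar_path)   # still photo of the avatar face
    cam = cv2.VideoCapture(0)          # your webcam provides the driving video
    while True:
        ok, frame = cam.read()
        if not ok:
            break
        fake = animate_frame(source, frame)    # avatar follows your motion
        cv2.imshow("avatar", fake)             # route this window to a virtual camera for Zoom
        if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to quit
            break
    cam.release()
    cv2.destroyAllWindows()


if __name__ == "__main__":
    run_avatar_loop("avatar.jpg")  # "avatar.jpg" is a placeholder path
```

The expensive part is the per-frame model inference; the rest is just reading webcam frames and handing the output to a virtual camera device that Zoom or Skype can select.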
"I ran [the First Order Model] on my PC and was surprised by the result. What's important, it worked fast enough to drive an avatar real-time," Aliev told Motherboard. "Developing a prototype was a matter of a couple of hours and I decided to make fun of my colleagues with whom I have a Zoom call each Monday. And that worked. As they are all engineers and researchers, the first reaction was curiosity and we soon began testing the prototype."
(Score: 4, Interesting) by bradley13 on Wednesday April 22 2020, @10:53AM (2 children)
Since the coronavirus outbreak, I've gotten a lot more spam than usual. At least one piece of it was incredibly well faked - it genuinely looked like it was from my employer, right up until I did a "view source" because the subject seemed... off. I don't want to know how many non-IT-savvy employees just clicked on the link it contained. Phishing attacks are getting very sophisticated.
So, think of those scams where an email, spoofed to be from the boss, tells the accounting dept. to transfer $millions to Outer Elbonia. Now send a videoconference link and deep-fake the boss. How much more believable will that be?
Real-time deep-fakes will bring some really challenging security issues...
(Score: 4, Insightful) by crafoo on Wednesday April 22 2020, @11:22AM (1 child)
People might actually start using digital signatures if this continues.
(Score: 3, Interesting) by DannyB on Wednesday April 22 2020, @02:50PM
You beat me to it. That is exactly right. I don't think they are going to deep fake a digital signature.
People are the weak link. That is what deep fakes and scams would target.
The target will then become the boss, who gets tricked into authorizing the very action that defrauds their own business. Is it possible for the bosses to become smart enough to recognize a ruse?
The bosses could rely on employees to recognize the ruse, with the employee using a digital signature to sign off that a request appears to be genuine and not some clever fake.
Then employees who try to recognize these fakes will become the social engineering targets.
So they will need an authorization from their boss.
But then . . .
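For the curious, "sign off with a digital signature" could look something like the sketch below: the approver signs the exact request with their private key, and accounting verifies it against the approver's known public key before paying anything. This is a minimal example using Python's cryptography package (Ed25519); the key handling and message format are illustrative, not a real approval workflow.

```python
# Minimal sketch of a signed approval (pip install cryptography).
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# In practice the boss's key pair would come from a managed PKI or a smartcard,
# not be generated on the fly like this.
boss_private_key = Ed25519PrivateKey.generate()
boss_public_key = boss_private_key.public_key()  # distributed to accounting in advance

# The exact request being approved -- any tampering invalidates the signature.
request = b"Transfer $1,000,000 to Outer Elbonia, invoice #4242"
signature = boss_private_key.sign(request)

# Accounting verifies against the boss's known public key before paying.
try:
    boss_public_key.verify(signature, request)
    print("Signature valid: the request came from the key holder.")
except InvalidSignature:
    print("Signature invalid: treat it as a fake, no matter how real the video looked.")
```

The point is that the signature binds the approval to the exact text of the request and to a key only the real approver holds, which a real-time deepfaked face on a video call can't reproduce.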