Unreal’s new iPhone app does live motion capture with Face ID sensors:
Unreal Engine developer Epic Games has released Live Link Face, an iPhone app that uses the front-facing 3D sensors in the phone to do live motion capture for facial animations in 3D projects like video games, animations, or films.
The app uses tools from Apple's ARKit framework and the iPhone's TrueDepth sensor array to stream live motion capture from an actor looking at the phone to 3D characters in Unreal Engine running on a nearby workstation. It captures facial expressions as well as head and neck rotation.
Live Link Face can stream to multiple machines at once, and "robust timecode support and precise frame accuracy enable seamless synchronization with other stage components like cameras and body motion capture," according to Epic's blog post announcing the app. Users get a CSV of raw blendshape data and an MOV from the phone's front-facing video camera, with timecodes.
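The per-frame data Epic describes (ARKit blendshape weights plus a timecode) is straightforward to post-process once exported. A minimal sketch in Python, assuming a hypothetical CSV layout with a timecode column followed by one ARKit blendshape coefficient per column; the actual layout of Epic's file is not documented in the post:

```python
import csv
import io

# Hypothetical sample of the per-frame blendshape CSV the app exports.
# Column names follow ARKit's blendshape coefficient names; the exact
# file layout here is an assumption, not Epic's documented format.
SAMPLE = """Timecode,jawOpen,eyeBlinkLeft,eyeBlinkRight
00:00:01:00,0.10,0.02,0.03
00:00:01:01,0.45,0.01,0.02
00:00:01:02,0.80,0.00,0.01
"""

def load_blendshapes(text):
    """Parse the CSV into a list of (timecode, {shape: weight}) frames."""
    frames = []
    for row in csv.DictReader(io.StringIO(text)):
        timecode = row.pop("Timecode")
        weights = {name: float(value) for name, value in row.items()}
        frames.append((timecode, weights))
    return frames

frames = load_blendshapes(SAMPLE)

# Example query: find the frame where the jaw is most open.
peak = max(frames, key=lambda frame: frame[1]["jawOpen"])
print(peak[0])  # -> 00:00:01:02
```

Because each frame carries a timecode, data like this can be lined up against the MOV from the front camera or against body mocap recorded on the same stage clock.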
[...] For those not familiar, Unreal Engine began life as a graphics engine used by triple-A video game studios for titles like Gears of War and Mass Effect, and it has evolved over time to be used by indies and in other situations like filmmaking, architecture, and design. It competes with another popular engine called Unity, as well as in-house tools developed by various studios.
(Score: 2) by FatPhil on Saturday July 11 2020, @07:37AM (1 child)
It looks like it's just an animoji in a game context rather than a messaging context. How is that advancing the field? Next innovation - embed it in the music playing app so that you can see yourself sing in 3D along with the music? I'm a genius, I should patent that.
Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
(Score: 2) by c0lo on Saturday July 11 2020, @07:56AM
My understanding, so I may be totally wrong
The difference is that the output of animoji is just an animated pile of poo (that maybe mimics your facial expression) while this one streams live motion capture data.
I assume animating a pile of poo requires fewer samples to mimic something well enough, but one expects a lot more to get something realistic.
The fact that one uses the Unreal engine to morph the MoCap onto the face of one's character of choice is incidental.
https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
(Score: 0) by Anonymous Coward on Monday July 13 2020, @09:31PM
It started life as the engine for 'Unreal', the game that came before Unreal Tournament as a competitor to Quake back in the 1990s. It competed with id's Tech engines and other middleware platforms like Gamebryo, but didn't do too well. Then around UE2 or UE3 it came into its own, being used for a variety of games, first on PC and then on consoles. Today it is used both for AAA games and for real-time interactive scenes during the prototyping of TV and movie shots, as well as bringing real-time cinematic-quality graphics to the PS5/XBNext and the PC platform.
For some other examples of its use: go look at any 3D-generated anime from last year, the Altered Carbon anime, The Mandalorian, and a few other major movies and TV shows of the past year. Gears of War is a comparative blip in Unreal Engine's rise to glory, as was its use by independents, which probably has more to do with projects like the Torque Engine going open source and pushing Unreal and Unity to be more flexible to keep indie developers interested, so that their big-budget demand didn't erode like that of the former middleware solutions they are rapidly replacing.