
posted by martyb on Saturday March 24 2018, @06:17PM   Printer-friendly
from the Adam-Selene dept.

Epic Games' Tim Sweeney on creating believable digital humans

Epic Games stunned everyone a couple of years ago with the realistic digital human character Senua, from the video game Hellblade. And today, the maker of the Unreal Engine game tools showed another astounding demo, dubbed Siren, with even more realistic graphics.

CEO Tim Sweeney said technologies for creating digital humans — from partners such as Cubic Motion and 3Lateral — are racing ahead to the point where we won't be able to tell the real from the artificial in video games and other real-time content.

[...] [Kim Libreri:] The other big thing for us, you may have seen the Microsoft announcements about their new raytracing capabilities in DirectX, DXR. We've partnered with Nvidia, who have the new RTX raytracing system, and we thought about how to show the world what a game could look like in the future once raytracing is added to the core capabilities of a PC, or maybe even a console one day. We teamed up with Nvidia and our friends at Lucasfilm, ILMxLAB, to make a short film that demonstrates the core capabilities of raytracing in Unreal Engine. It's an experimental piece, but it shows the kind of features we'll add to the engine over the next year or so.

We've added support for what we call textured area lights, which is the same way we would light movies. You can see multiple reflections. You can see on the character, when she's carrying her gun, the reflection of the back of the gun in her chest plate. It's running on an Nvidia DGX-1, which is a four-GPU graphics computer they make. But as you know, hardware gets better every year. Hopefully one day there's a machine that can do this for gamers as well as high-end professionals. It's beginning to blur the line between what a movie looks like and what a game can look like. We think there's an exciting time ahead.
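Libreri's "textured area lights" are easiest to see as an integral: the light reaching a surface point sums contributions from every spot on the emitter, each weighted by the texture colour at that spot, the two cosine terms, and inverse-square falloff. The sketch below is a hypothetical plain-Python Monte Carlo estimator of that integral, nothing like Unreal's or DXR's actual code; the checker texture and all scene parameters are invented for illustration:

```python
import math
import random

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def checker(u, v):
    """Invented emissive texture: a 2x2 black/white checker."""
    return 1.0 if (int(u * 2) + int(v * 2)) % 2 == 0 else 0.0

def direct_irradiance(point, normal, corner, edge_u, edge_v, samples=20000):
    """Monte Carlo estimate of irradiance at `point` from a one-sided,
    textured rectangular area light (occlusion/shadow rays omitted)."""
    n_l = cross(edge_u, edge_v)          # unnormalised light-plane normal
    area = math.sqrt(dot(n_l, n_l))      # rectangle area
    n_l = tuple(c / area for c in n_l)   # unit emitting direction
    total = 0.0
    for _ in range(samples):
        u, v = random.random(), random.random()
        lp = tuple(corner[i] + u * edge_u[i] + v * edge_v[i] for i in range(3))
        d = tuple(lp[i] - point[i] for i in range(3))
        dist2 = dot(d, d)
        wi = tuple(c / math.sqrt(dist2) for c in d)  # unit direction to light
        cos_s = max(0.0, dot(wi, normal))            # surface cosine
        cos_l = max(0.0, -dot(wi, n_l))              # emitter cosine
        total += checker(u, v) * cos_s * cos_l / dist2
    return total * area / samples

# Unit-square light hovering one unit above an upward-facing point.
E = direct_irradiance(point=(0.0, 0.0, 0.0), normal=(0.0, 1.0, 0.0),
                      corner=(-0.5, 1.0, -0.5),
                      edge_u=(1.0, 0.0, 0.0), edge_v=(0.0, 0.0, 1.0))
print(E)  # the checker texture zeroes out roughly half the emission
```

A production renderer evaluates the same kind of estimator per pixel, with importance sampling and a shadow ray per light sample, which is why hardware-accelerated raytracing is what makes film-style lighting feasible in real time.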

One thing we've been interested in over the years is digital humans. Two years ago we showed Senua, the Hellblade character. To this day, that's pretty much state of the art. But we wanted to see if we could get closer to crossing the uncanny valley. She was great, but you could see that the facial animation wasn't quite there. The details in the skin and the hair—it was still a fair way from crossing the uncanny valley.

Video is available on YouTube: Siren, alone (42s) and Siren Behind The Scenes (52s), and Creating Believable Characters in Unreal Engine (56m31s).

Related: Microsoft Announces DirectX 12 Raytracing API


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 2) by KritonK (465) on Sunday March 25 2018, @12:16PM (#657871) (3 children)

    So, we have a very expensive way of shooting an actress and projecting her image on screen: digitizing her face and movements, applying the data to a digital model, and rendering it in real time. I could have done this a lot cheaper by simply putting a red dress on the actress and shooting her with a camera.

    Wake me up when a computer can generate all those subtle movements and expressions on its own, without needing to copy them from a human.

  • (Score: 2) by Wootery (2341) on Monday March 26 2018, @11:06AM (#658356) (2 children)

    and rendering it in real time

    I could have done this a lot cheaper by simply putting a red dress on the actress and shooting her with a camera.

    So... not even close to the same thing, then.

    Wake me up when a computer can generate all those subtle movements and expressions on its own, without needing to copy them from a human.

    So you don't want to be woken up when real-time graphics (in, say, video games) look as realistic as the video? All that tells us is that you aren't interested in computer graphics.

    Fair enough, but don't pretend that's some kind of great insight into their achievement.

    • (Score: 2) by KritonK (465) on Monday March 26 2018, @02:11PM (#658426) (1 child)

      What they did is just a very complicated way of pointing a camera at a person and reproducing what the camera "sees", given that the end result is the actress appearing on screen. The fact that they decomposed the actress into zillions of bits of information and then recomposed them does not make it intrinsically different from directly recording the data from the camera's sensor and reproducing that data on screen, with minimal intervention. Only more complicated.

      This is fine if you want to do it the hard way just for the sake of doing it, especially if, as you say, you are interested in computer graphics, but it isn't of much use at the moment. One might argue that this is preparation for the day when computers will actually be able to produce all those subtle human movements and gestures without needing to copy them from a human, but that is the hard part.

      • (Score: 2) by Wootery (2341) on Monday March 26 2018, @02:32PM (#658440)

        that they decomposed the actress into zillions of bits of information and then recomposed them, does not make it intrinsically different from directly recording the data from the camera's sensor and reproducing these data on screen

        Of course it does. You're aware that the field of computer graphics is a thing that exists, right?

        this isn't of much use at the moment

        It's a tech demo...

        One might argue that this is preparation for the day when computers will actually be able to produce all those subtle human movements and gestures, without needing to copy them from a human, but this is the hard part.

        And you could have used the same nonsensical reasoning to dismiss the DOOM engine and the Quake engine, all the way down the line to the modern day...

        Are you trolling, or are you unable to comprehend that a computer graphics tech demo might, you know, demonstrate a new computer graphics technology?