
posted by martyb on Saturday March 24 2018, @06:17PM   Printer-friendly
from the Adam-Selene dept.

Epic Games' Tim Sweeney on creating believable digital humans

Epic Games stunned everyone a couple of years ago with the realistic digital human character Senua, from the video game Hellblade. And today, the maker of the Unreal Engine game tools showed another astounding demo, dubbed Siren, with even more realistic graphics.

CEO Tim Sweeney said technologies for creating digital humans — from partners such as Cubic Motion and 3Lateral — are racing ahead to the point where we won't be able to tell the real from the artificial in video games and other real-time content.

[...] [Kim Libreri:] The other big thing for us, you may have seen the Microsoft announcements about their new raytracing capabilities in DirectX, DXR. We've partnered with Nvidia, who have the new RTX raytracing system, and we thought about how to show the world what a game could look like in the future once raytracing is added to the core capabilities of a PC, or maybe even a console one day. We teamed up with Nvidia and our friends at LucasFilm, the ILM X-Lab, to make a short film that demonstrates the core capabilities of raytracing in Unreal Engine. It's an experimental piece, but it shows the kind of features we'll add to the engine over the next year or so.

We've added support for what we call textured area lights, which is the same way we would light movies. You can see multiple reflections. You can see on the character, when she's carrying her gun, the reflection of the back of the gun in her chest plate. It's running on an Nvidia DGX-1, which is a four-GPU graphics computer they make. But as you know, hardware gets better every year. Hopefully one day there's a machine that can do this for gamers as well as high-end professionals. It's beginning to blur the line between what a movie looks like and what a game can look like. We think there's an exciting time ahead.
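The "textured area lights" Libreri mentions boil down to integrating emitted radiance over a light's surface, where each point on the light emits what a texture says it should. As a rough illustration only (nothing to do with Unreal's actual renderer; the function name, the two-sided emitter, the Lambertian surface, and the absence of shadow rays are all simplifying assumptions of mine), here is a minimal Monte Carlo sketch of diffuse direct lighting from a textured rectangular light:

```python
import math
import random

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def estimate_direct_light(surface_p, normal, light_origin, light_u, light_v,
                          texture, albedo=0.8, samples=256):
    """Monte Carlo estimate of diffuse direct lighting at surface_p from a
    textured rectangular area light (two-sided emitter, no shadowing).

    texture is a 2D grid of emitted radiance values, indexed [row][col];
    light_origin plus the edge vectors light_u, light_v span the rectangle.
    """
    rng = random.Random(0)           # fixed seed: deterministic estimate
    rows, cols = len(texture), len(texture[0])
    c = cross(light_u, light_v)
    area = math.sqrt(dot(c, c))      # rectangle area = |u x v|
    light_n = [x / area for x in c]  # unit normal of the light
    total = 0.0
    for _ in range(samples):
        s, t = rng.random(), rng.random()
        # Uniformly sample a point on the light; look up the texel it emits.
        lp = [light_origin[i] + s * light_u[i] + t * light_v[i] for i in range(3)]
        le = texture[min(int(t * rows), rows - 1)][min(int(s * cols), cols - 1)]
        # Geometry term between the surface point and the light sample.
        d = [lp[i] - surface_p[i] for i in range(3)]
        dist2 = dot(d, d)
        w = [x / math.sqrt(dist2) for x in d]
        cos_surf = max(0.0, dot(normal, w))
        cos_light = abs(dot(light_n, w))  # two-sided emitter for simplicity
        # Area-sampling estimator: Le * (albedo/pi) * G / pdf, with pdf = 1/area.
        total += le * (albedo / math.pi) * cos_surf * cos_light / dist2 * area
    return total / samples
```

Varying the texture values varies the lighting exactly the way a lit movie set would: brighter texels contribute more radiance, and the reflections Libreri describes fall out of evaluating this kind of integral (with a specular rather than diffuse BRDF) at each visible point.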

One thing we've been interested in over the years is digital humans. Two years ago we showed Senua, the Hellblade character. To this day, that's pretty much state of the art. But we wanted to see if we could get closer to crossing the uncanny valley. She was great, but you could see that the facial animation wasn't quite there. The details in the skin and the hair—it was still a fair way from crossing the uncanny valley.

Video is available on YouTube: Siren, alone (42s) and Siren Behind The Scenes (52s), and Creating Believable Characters in Unreal Engine (56m31s).

Related: Microsoft Announces DirectX 12 Raytracing API


Original Submission

 
  • (Score: 2, Touché) by Captival on Saturday March 24 2018, @07:23PM (5 children)

    by Captival (6866) on Saturday March 24 2018, @07:23PM (#657639)

    The uncanny valley valley, where people reflexively have to find something wrong with any CGI, even to the point of imagining it, and post about it to show how smart they are.

  • (Score: 2) by JoeMerchant on Saturday March 24 2018, @08:05PM (1 child)

    by JoeMerchant (3937) on Saturday March 24 2018, @08:05PM (#657653)

Polar Express had me fooled for at least 30 seconds, long enough that I was questioning whether it was live action with CGI post-manipulation.

That feeling when you transition from being fooled to not being fooled is what makes the uncanny valley uncomfortable. There are literally millions of potential cues per minute; once enough pile up to put you in the questioning state, it's just weird there. Once you're clearly on the Pixar side of the universe, you can get back to accepting things that talk but shouldn't, and enjoy the story.

  • (Score: 0) by Anonymous Coward on Saturday March 24 2018, @09:00PM (1 child)

    by Anonymous Coward on Saturday March 24 2018, @09:00PM (#657666)

That one was pretty good. Unfortunately for me, the compression artifacts messed with the video. The lip sync was slightly off and the mouth was a bit odd, but other than that, pretty good.

    • (Score: 2) by Rivenaleem on Monday March 26 2018, @11:00AM

      by Rivenaleem (3400) on Monday March 26 2018, @11:00AM (#658354)

"The mouth is odd": the problem with the mouth can be seen clearly in the behind-the-scenes video, and it's mostly down to the tongue. The tongue is not easily captured, so it disappears in the digital version. In the first clip you only see a tiny bit of it, for only a fraction of a second. But in the behind-the-scenes footage you can see how much more of it is visible, along with light reflections off her teeth, tongue, gums, and cheeks. All of this is missing from the digital version, which makes talking models more likely to throw you off.

  • (Score: 0) by Anonymous Coward on Sunday March 25 2018, @01:23AM

    by Anonymous Coward on Sunday March 25 2018, @01:23AM (#657741)

    The uncanny valley valley, where people reflexively have to find something wrong with any CGI, even to the point of imagining it, and post about it to show how smart they are.

    You must mean someone who is a Pixel Pedant.