SoylentNews is people

posted by martyb on Monday January 21 2019, @05:37AM   Printer-friendly
from the when-you-want-a-top-end-video-card-for-research dept.

Q2VKPT Is the First Entirely Raytraced Game with Fully Dynamic Real-Time Lighting, Runs 1440P@60FPS with RTX 2080Ti via Vulkan API

Q2VKPT [is] an interesting graphics research project whose goal is to create the first entirely raytraced game with fully dynamic real-time lighting, based on the Quake II engine Q2PRO. Rasterization is used only for the 2D user interface (UI).

Q2VKPT is powered by the Vulkan API and now, with the release of the GeForce RTX graphics cards capable of accelerating ray tracing via hardware, it can get close to 60 frames per second at 1440p (2560×1440) resolution with the RTX 2080 Ti GPU according to project creator Christoph Schied.

The project consists of about 12K lines of code which completely replace the graphics code of Quake II. It's open source and can be freely downloaded via GitHub.

This is how path tracing + denoising (4m16s video) works.
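For readers unfamiliar with the pipeline: a path tracer fires randomized light paths through each pixel, so every frame is an unbiased but very noisy estimate of the true lighting, and a denoiser then filters that noise away. Q2VKPT's actual denoiser is a sophisticated spatiotemporal filter; the toy Python below (all names and numbers invented for illustration) shows only the core statistical idea, temporal accumulation of noisy per-pixel samples:

```python
import random

def path_trace_sample(true_radiance, rng):
    # Stand-in for one Monte Carlo path-tracing sample of a pixel:
    # unbiased (it averages to the true radiance) but individually noisy.
    return true_radiance * 2.0 * rng.random()

def accumulate(mean, sample, n):
    # Temporal accumulation, the simplest denoising ingredient:
    # keep a running average of the samples each pixel has seen so far.
    return mean + (sample - mean) / n

rng = random.Random(1)
true_radiance = 0.75  # arbitrary "ground truth" for one pixel

mean = 0.0
for n in range(1, 501):  # 500 "frames", one sample per pixel per frame
    mean = accumulate(mean, path_trace_sample(true_radiance, rng), n)

# mean has now converged close to true_radiance
```

With a static camera, averaging across frames like this is essentially free denoising; the hard part, which the video covers, is keeping that history valid when the camera, geometry, and lights all move every frame.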

Also at Phoronix.

Related: Nvidia Announces Turing Architecture With Focus on Ray-Tracing and Lower-Precision Operations
Nvidia Announces RTX 2080 Ti, 2080, and 2070 GPUs, Claims 25x Increase in Ray-Tracing Performance


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 2) by ledow on Monday January 21 2019, @09:51AM (2 children)

    by ledow (5567) on Monday January 21 2019, @09:51AM (#789494) Homepage

    And it's like all these things since the days of Tenebrae Quake, etc.

    I think I'd rather have the non-raytraced version that runs at a decent speed on sensible hardware.

    I compare it to the way that we don't have movie cameras that you can just take into a building and "film" without studio lights, metering, reflectors, etc.

    Sure, it's "less realistic". You're filming people wearing make-up, under really bright lights, in a controlled set. But it makes a better movie than someone with even the most expensive "human-eye-like" camera just filming in a room made to be lit exactly as the script says (e.g. by a single candle or flaming torch or whatever).

    The reason we put up lights and falsify the lighting is that it makes both the film-maker's and the audience's lives easier and their experience better. The reason that GTA V consists of hundreds of hand-crafted shaders and multi-pass graphics (there's an article somewhere about it - the work that goes into packing information into spare pixel channels, and the shader-work beyond simply drawing something on the screen, is amazing) is that ray-tracing it would not only be much more intensive for the audience (a single dev taking a year to optimise a shader is "better" for you than everyone getting a game that runs like a stunned sloth), but would also make it much harder to get it looking right (you'd have to replicate a real environment, and all those explosions and bright lights wouldn't look the way you want them to in a game), and the result would then just look like "a real thing" - which is almost certainly not the effect you're after in a movie or video game, no matter what you might think. It would also take as much effort as - if not more than - just licensing an engine, or adjusting/creating your own to do the same.

    Your artists would suddenly be just model-makers, and they'd have to include the right wattage of lighting all the way down the entire underground level or you're just in a black pit and can't see anything (remember Doom?). They'd have to colour everything accordingly and spend just as much time making it work and be realistic as they would have spent just saying "Look, can we tint the floor here so the player doesn't notice, but so they can read the vital plot 'scrawl of blood' on the floor?"

    Ray-tracing has been around forever. The early ray-tracing demos now just run on modern PCs. Not only are there lots of problems with them, but not one company has ever seriously made a ray-traced game; the effort involved gains them nothing. That Quake 2 demo looks just like a slightly prettier Quake 2 to me. Though I can see what I'm supposed to notice - the shadows, etc. - what I don't see is anything a shader couldn't do ten times faster, even if it's not perfectly accurate.

  • (Score: 2) by pkrasimirov on Monday January 21 2019, @10:04AM

    by pkrasimirov (3358) Subscriber Badge on Monday January 21 2019, @10:04AM (#789499)

    What's "realistic" is defined when the NN is trained. Once you have the "reality" in your grasp, it can be graded into an insane amount of HDR, or pastel colours, or just gamma, or whatever.
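The underlying point holds beyond neural networks, too: once a renderer produces physically based linear radiance, the "look" becomes a post-process grading step. A minimal sketch of that idea in Python, using a Reinhard-style tone map plus gamma encoding (the radiance values are made up, and this is not anything Q2VKPT specifically does):

```python
def reinhard(radiance):
    # Reinhard-style tone map: compresses unbounded linear HDR
    # radiance into the displayable range [0, 1).
    return radiance / (1.0 + radiance)

def gamma_encode(value, gamma=2.2):
    # Standard display gamma encoding.
    return value ** (1.0 / gamma)

hdr = [0.05, 1.0, 4.0, 50.0]  # made-up linear radiance values
ldr = [gamma_encode(reinhard(r)) for r in hdr]
```

Swapping in a different tone-mapping operator or gamma changes the whole look without touching the lighting simulation at all.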

  • (Score: 2) by takyon on Monday January 21 2019, @05:21PM

    by takyon (881) <takyonNO@SPAMsoylentnews.org> on Monday January 21 2019, @05:21PM (#789659) Journal

    This is another tool in the toolbox. Some will use it to great effect, others will use it sloppily (see HDR [resetera.com]).

    Your artists would suddenly be just model-makers, and they'd have to include the right wattage of lighting all the way down the entire underground level or you're just in a black pit and can't see anything (remember Doom?). They'd have to colour everything accordingly and spend just as much time making it work and be realistic as they would have spent just saying "Look, can we tint the floor here so the player doesn't notice, but so they can read the vital plot 'scrawl of blood' on the floor?"

    Artists are beginning to use machine learning to cut down on their workload. It could be applicable to the menial tasks you mention. Aspects such as level design should be even more important than visual design and will probably require the human touch, at least for now.

    I think I'd rather have the non-raytraced version that runs at a decent speed on sensible hardware.

    What is decent performance? 60 FPS? 90? 120? 240? At 4K resolution? 16K?

    https://soylentnews.org/article.pl?sid=18/12/02/232213 [soylentnews.org]
    https://www.darpa.mil/attachments/3DSoCProposersDay20170915.pdf [darpa.mil]

    With new transistor types and 3D architectures, we could see such a huge performance increase that we will be able to do real-time raytracing, at 240 FPS, 16K resolution [soylentnews.org], in a 1 Watt chip used in a VR headset. (Of course, foveated rendering could reduce the burden for a headset chip considerably.) But long before then, this path tracing approach will work.
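Back-of-the-envelope arithmetic for that target, taking "16K" as 15360x8640 (toy Python; the resolutions and frame rates are the ones named above, and the foveation factor is purely hypothetical):

```python
def pixels_per_second(width, height, fps):
    return width * height * fps

today = pixels_per_second(2560, 1440, 60)     # the RTX 2080 Ti result above
target = pixels_per_second(15360, 8640, 240)  # "16K" at 240 FPS

print(target // today)  # → 144: roughly 144x today's pixel throughput

# Hypothetical foveation: if eye tracking lets the headset shade only
# ~10% of each frame at full quality, the gap shrinks by ~10x.
foveated_target = target // 10
```

So the comment's scenario asks for on the order of two decimal orders of magnitude more throughput than the 2080 Ti result, before foveated rendering is taken into account.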

    AMD was right to wait and see on real-time raytracing. However, while Nvidia's current hardware is overhyped early adopter stuff, they will be able to massively increase the performance in subsequent generations. Just the shrink to TSMC "7nm" alone will give them a lot of extra die space that could be used by dedicated raytracing/tensor cores.
