
SoylentNews is people

posted by Fnord666 on Sunday December 10 2017, @09:57PM   Printer-friendly
from the ontology-in-VR dept.

Google Is Building A New Foveation Pipeline For Future XR Hardware

Google's R&D arm, Google Research, has been investigating ways to improve the performance of foveated rendering. Foveated rendering already promises vast performance improvements compared to full-resolution rendering, but Google believes it can do even better. The company identified three problems with current foveation approaches and proposed a solution for each, including two new foveation techniques and a reworked rendering pipeline.

Foveated rendering is a virtual reality technique that uses eye tracking to reduce rendering quality in the areas covered by peripheral vision, where the eye cannot resolve fine detail.

The new techniques mentioned are Phase-Aligned Rendering and Conformal Rendering.
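The core idea behind foveated rendering is easy to sketch. The following is a minimal illustrative model, not Google's actual pipeline: the fovea size and falloff rate are invented parameters chosen only to show how the pixel budget shrinks when quality falls off with angular distance from the gaze point.

```python
def foveation_scale(eccentricity_deg, fovea_deg=5.0, falloff=0.05):
    """Illustrative resolution multiplier for a sample at a given
    angular distance from the tracked gaze point: full resolution
    inside the fovea, linear falloff outside it, floored at 10%.
    (Hypothetical parameters, not Google's actual model.)"""
    if eccentricity_deg <= fovea_deg:
        return 1.0
    return max(0.1, 1.0 - falloff * (eccentricity_deg - fovea_deg))

# Rough pixel-cost saving over a 100-degree field of view, treating
# the display as a 1-D strip of 1-degree samples with the gaze
# fixed at the center (50 degrees):
full = 100
foveated = sum(foveation_scale(abs(e - 50)) for e in range(100))
print(f"foveated cost: {foveated / full:.0%} of full resolution")
# prints: foveated cost: 35% of full resolution
```

Even with this crude linear falloff, the strip renders at roughly a third of the full-resolution cost, which is why eye-tracked foveation is so attractive for standalone XR hardware.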

Also at Google's Research Blog.

Related: Oculus VR Founder Palmer Luckey on the Need for "Unlimited Graphics Horsepower"
Google Implements Equi-Angular Cubemaps Technique for Better VR Quality
Oculus Research Presents Focal Surface Display. Will Eliminate Nausea in VR
Virtual Reality Audiences Stare Straight Ahead 75% of the Time


Original Submission

Related Stories

Oculus VR Founder Palmer Luckey on the Need for "Unlimited Graphics Horsepower" 34 comments

Tom's Hardware conducted an interview with Palmer Luckey, the founder of Oculus VR. The defining takeaway? Virtual reality needs as much graphics resources as can be thrown at it:

Tom's Hardware: If there was one challenge in VR that you had to overcome that you really wish wasn't an issue, which would it be?

Palmer Luckey: Probably unlimited GPU horsepower. It is one of the issues in VR that cannot be solved at this time. We can make our hardware as good as we want, our optics as sharp as we can, but at the end of the day we are reliant on how many flops the GPU can push, how high a framerate can it push? Right now, to get 90 frames per second [the minimum target framerate for Oculus VR] and very low latencies we need heaps of power, and we need to bump the quality of the graphics way down.

If we had unlimited GPU horsepower in everybody's computer, that will make our lives very much easier. Of course, that's not something we can control, and it's a problem that will be solved in due time.

TH: Isn't it okay to deal with the limited power we have today, because we're still in the stepping stones of VR technology?

PL: It's not just about the graphics being simple. You can have lots of objects in the virtual environment, and it can still cripple the experience. Yes, we are able to make immersive games on VR with simpler graphics on this limited power, but the reality is that our ability to create what we are imagining is being limited by the limited GPU horsepower.

[...] The goal in the long run is not only to sell to people who buy game consoles, but also to people who buy mobile phones. You need to expand so that you can connect hundreds of millions of people to VR. It may not necessarily exist in the form of a phone dropping into a headset, but it will be mobile technologies -- mobile CPUs, mobile graphics cards, etc.

In the future, VR headsets are going to have all the render hardware on board, no longer being hardwired to a PC. A self-contained set of glasses is a whole other level of mainstream.


Google Implements Equi-Angular Cubemaps Technique for Better VR Quality 2 comments

Google has described a projection technique called Equi-Angular Cubemaps that can improve VR/360° video quality:

The image quality issue that plagues spherical video sources is a perspective problem. A traditional 2D video records a predetermined frame and is played back on a display of the same shape and perspective. VR/360-degree video doesn't offer that luxury. Although you can choose which perspective you wish to see, your view is always a fixed resolution and shape, no matter where you look. The image is technically flat, which means the source feed must be warped to fit a flat plane. The process is comparable to what cartographers go through when trying to map the globe to a flat surface. In order to fit the spherical earth map onto a flat image, the perspective must be altered.

[...] Google's Daydream and YouTube engineers came up with a new projection technique called Equi-Angular Cubemaps (EAC) that offers less disruptive image degradation. EAC keeps the pixel count even between cubemap samples, which produces balanced image quality across the board. [...] Google is already putting Equi-Angular Cubemap projection to work. Spherical video playback from YouTube with EAC support is now available on Android devices, and Google said support for iOS and desktop is coming soon. If you want to know more about EAC, Google's blog offers a deeper explanation of the technology, and the YouTube Engineering and Developers blog has additional details.
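The geometry behind EAC is simple to demonstrate. In a standard cubemap a face coordinate u corresponds to the view angle atan(u), so texels near a face's edge cover only about half the angle of texels at the center; EAC resamples the face so that equal angles get equal texels, using the remapping q = (4/π)·atan(u) described in Google's blog post. A small sketch:

```python
import math

def eac_encode(u):
    """Map a standard cubemap face coordinate u in [-1, 1] to the
    equi-angular coordinate q in [-1, 1], so equal view angles get
    equal numbers of texels across the face."""
    return (4.0 / math.pi) * math.atan(u)

def eac_decode(q):
    """Inverse mapping: equi-angular coordinate back to the
    standard face coordinate."""
    return math.tan(q * math.pi / 4.0)

# The problem EAC fixes: in a standard cubemap, the view angle
# covered by one texel shrinks toward the face edge.
def angle_per_texel(u, du=1e-4):
    return math.atan(u + du) - math.atan(u)

center = angle_per_texel(0.0)  # texel at the face center
edge = angle_per_texel(1.0)    # texel at the face edge
print(f"edge texel covers {edge / center:.0%} of a center texel's angle")
# prints: edge texel covers 50% of a center texel's angle
```

After the EAC remap, each texel spans the same view angle, so pixel density is uniform across the face rather than wasted at the edges.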


Original Submission

Oculus Research Presents Focal Surface Display. Will Eliminate Nausea in VR 18 comments

Focal surface displays mimic the way our eyes naturally focus on objects of varying depths. Rather than trying to add more and more focus areas to get the same degree of depth, this new approach changes the way light enters the display using spatial light modulators (SLMs) to bend the headset's focus around 3D objects—increasing depth and maximizing the amount of space represented simultaneously.

All of this adds up to improved image sharpness and a more natural viewing experience in VR.

"Quite frankly, one of the reasons this project ran as long as it did is that we did a bunch of things wrong the first time around," jokes Research Scientist Fix. "Manipulating focus isn't quite the same as modulating intensity or other more usual tasks in computational displays, and it took us a while to get to the correct mathematical formulation that finally brought everything together. Our overall motivation was to do things the 'right' way—solid engineering combined with the math and algorithms to back it up. We weren't going to be happy with something that only worked on paper or a hacked together prototype that didn't have any rigorous explanation of why it worked."

The paper (PDF).

-- submitted from IRC


Original Submission

Virtual Reality Audiences Stare Straight Ahead 75% of the Time 44 comments

YouTube's revealed the secret to making an engaging virtual reality video: put the best parts right in front of the audience so they don't have to move their heads.

Google's video vault offers that advice on the basis of heat maps it has created by analysing where VR viewers point their heads while wearing VR goggles.

The many heat maps YouTube has made lead it to suggest that VR video creators "Focus on what's in front of you: The defining feature of a 360-degree video is that it allows you to freely look around in any direction, but surprisingly, people spent 75% of their time within the front 90 degrees of a video. So don't forget to spend significant time on what's in front of the viewer."

YouTube also advises that "for many of the most popular VR videos, people viewed more of the full 360-degree space with almost 20% of views actually being behind them." Which sounds to El Reg like VR viewers are either staring straight ahead, or looking over their shoulders with very little time being devoted to sideways glances.

A video channel wants people to treat VR like video. Hmmm. Perhaps the answer to their question is in the question: people should be considered "participants" instead of an "audience."


Original Submission

Varjo VR-1 Headset Uses a Central, Fixed-Foveated Display 5 comments

A new, business-oriented VR headset uses a tiny, high resolution display panel within a larger panel in order to display very high quality imagery to users looking straight ahead:

The VR-1 calls its center panel a "Bionic Display." It's a 1920 x 1080 "micro-OLED" display with a resolution of 3,000 pixels per inch. (For context, last year's high-resolution prototype display from Google and LG had 1443 ppi.) Within that central strip, images are supposed to roughly match the resolution of the human eye. As Ars Technica, which checked out the headset, puts it, that section looks "every bit as detailed as real life." Outside that super crisp panel, there's a 1440 x 1600 display that produces images of more average quality.

The VR-1's total 87-degree field of view is smaller than that of the Oculus Rift or HTC Vive, let alone the 200 degrees offered by something like Pimax's more experimental VR headset. The Bionic Display only comprises a slice of it. Ars Technica describes great image quality while you're looking straight ahead, with a noticeable downgrade outside that. And rendering that high-resolution slice requires more processing power than you'd need for average VR headsets, which are already fairly demanding.

[...] The VR-1 uses standard SteamVR base stations for tracking, and it supports both the Unity and Unreal engines, so you could theoretically play games or use other consumer software. But the headset isn't priced for consumers. It costs $5,995 with an annual service fee of $995, and Varjo stresses that it's "only available for businesses and academic institutions." The company is already working with Airbus, Audi, Saab, Volkswagen, and Volvo, among others.
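As a sanity check on the quoted numbers, a 1920 x 1080 panel at 3,000 pixels per inch implies a display well under an inch across (assuming square pixels):

```python
# Physical size implied by the quoted VR-1 "Bionic Display" specs:
# 1920 x 1080 pixels at 3,000 pixels per inch.
width_in = 1920 / 3000   # 0.64 inch
height_in = 1080 / 3000  # 0.36 inch
diag_in = (width_in**2 + height_in**2) ** 0.5
print(f"{width_in:.2f} x {height_in:.2f} in, {diag_in:.2f} in diagonal")
# prints: 0.64 x 0.36 in, 0.73 in diagonal
```

A panel roughly the size of a camera viewfinder, which is consistent with it covering only the central slice of the headset's field of view.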

Human eye - Field of view.

Also at Road to VR.

Related: Virtual Reality Audiences Stare Straight Ahead 75% of the Time
Google Research Proposes New Foveated Rendering Techniques for VR
Google and LG to Show Off World's Highest Resolution OLED-on-Glass Display in May


Original Submission

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 2) by Gaaark on Sunday December 10 2017, @10:12PM (4 children)

    by Gaaark (41) Subscriber Badge on Sunday December 10 2017, @10:12PM (#608077) Journal

    All that, and all I read was "More PATENTS! Yeah!"

    --
    --- Please remind me if I haven't been civil to you: I'm channeling MDC. ---Gaaark 2.0 ---
    • (Score: 0) by Anonymous Coward on Monday December 11 2017, @02:04AM (3 children)

      by Anonymous Coward on Monday December 11 2017, @02:04AM (#608147)

      So in 20 years we can use these techniques ourselves.

      Personally I just need regular (and cheap!) VR headsets, supporting basic tracking and at least 2160p per eye (ideally with quality scaling/sampling features so I can run it at lower resolution with older hardware.)

      I mostly want them for johnny mnemonic style virtual navigation, augmented reality (how about some headsets with at least two cameras on the front?!?!?!) and used as a means of 'blacking out' offscreen material while using webcam probes or microscopes on low visibility subject matter.

      • (Score: 2) by c0lo on Monday December 11 2017, @02:34AM (2 children)

        by c0lo (156) on Monday December 11 2017, @02:34AM (#608161) Journal

        So in 20 years we can use these techniques ourselves.

In 20 years' time it's likely those techniques will be deprecated.

        --
        https://www.youtube.com/watch?v=aoFiw2jMy-0
        • (Score: 2) by MichaelDavidCrawford on Monday December 11 2017, @04:42AM (1 child)

          by MichaelDavidCrawford (2339) Subscriber Badge <mdcrawford@gmail.com> on Monday December 11 2017, @04:42AM (#608194) Homepage Journal

          and it's still in widespread use.

          So is the transistor. Perhaps you're familiar with it.

          --
          Yes I Have No Bananas. [gofundme.com]
          • (Score: 3, Interesting) by c0lo on Monday December 11 2017, @07:30AM

            by c0lo (156) on Monday December 11 2017, @07:30AM (#608219) Journal

            So is the transistor. Perhaps you're familiar with it.

            And your point is...? No, seriously, I'm curious.
Because you'd have to make a serious effort to find consumer electronics which nowadays use individual transistors (the way they were patented).

(my point was: "foveated rendering" is a technique to get around the limitations of current hardware.
In 20 years' time, it is highly likely the hardware will be sufficiently performant to allow the VR headset to be simplified, eliminating the need for an "eye tracker" and simplifying the gizmo. The simpler, the more robust, the lower the cost).

            --
            https://www.youtube.com/watch?v=aoFiw2jMy-0
  • (Score: 3, Interesting) by Ethanol-fueled on Sunday December 10 2017, @10:33PM (2 children)

    by Ethanol-fueled (2792) on Sunday December 10 2017, @10:33PM (#608085) Homepage

That's a lot like how the real eye works (from a perceptual standpoint, not a physical one), peripheral vision and whatnot. It's not nearly as mind-blowing as occlusion culling, which has been around for a while and is always a good trivia tidbit when gayming with non-technical types.

    • (Score: 3, Informative) by wonkey_monkey on Sunday December 10 2017, @10:54PM (1 child)

      by wonkey_monkey (279) on Sunday December 10 2017, @10:54PM (#608089) Homepage

      That's a lot like how the real eye works (from a perceptual standpoint, not a physical one)

      It's both, and it's not just "a lot like." It's the whole reason foveated rendering is a thing in the first place.

      This is like looking at a cross-trainer and saying "hey, the way those thingies go up and down is a lot like how people's feet move when they run!"

      It's not nearly as mind-blowing as occlusion culling

      You must have quite a low mind-blow threshold.

      --
      systemd is Roko's Basilisk
      • (Score: 1) by Ethanol-fueled on Sunday December 10 2017, @10:58PM

        by Ethanol-fueled (2792) on Sunday December 10 2017, @10:58PM (#608092) Homepage

        Not really. Blurring or otherwise reducing the quality of peripheral vision is a lot more understandable to non-technical types than the explanation that there is some magical black hole netherworld behind everything that is obstructing their vision.

  • (Score: 2, Funny) by Anonymous Coward on Sunday December 10 2017, @11:05PM

    by Anonymous Coward on Sunday December 10 2017, @11:05PM (#608093)

    google will build a database of my eye movements: "citizen unit #666, STOP looking at the boobies!"

  • (Score: 0) by Anonymous Coward on Sunday December 10 2017, @11:09PM (3 children)

    by Anonymous Coward on Sunday December 10 2017, @11:09PM (#608095)

How will this improve my p0rn-viewing experience? Let's be honest here, it's all that VR is good for at the moment.

    • (Score: 3, Informative) by takyon on Sunday December 10 2017, @11:16PM (2 children)

      by takyon (881) <reversethis-{gro ... s} {ta} {noykat}> on Sunday December 10 2017, @11:16PM (#608102) Journal

      It will render the boobies or poooooooosy you are intently focusing on, discarding unnecessary details like the lamp in the background that would only lower your "FPS".

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 3, Funny) by Anonymous Coward on Monday December 11 2017, @12:15AM

        by Anonymous Coward on Monday December 11 2017, @12:15AM (#608124)

        There is a lamp in there ?

      • (Score: 4, Funny) by rts008 on Monday December 11 2017, @12:41AM

        by rts008 (3001) on Monday December 11 2017, @12:41AM (#608131)

        ...like the lamp in the background that would only lower your "FPS".

        The Lamp!!! Now I know why it makes you go blind!
        *note to self: find sunglasses*

        'FPS', Faps Per Second?

        Double clutching can also lower your FPS. ;-)

  • (Score: 0) by Anonymous Coward on Monday December 11 2017, @02:09AM (1 child)

    by Anonymous Coward on Monday December 11 2017, @02:09AM (#608148)

    That was maybe 5 years ago.

    Now? He does cyberwar. It's more patriotic and the employment is more reliable.

    Actually, I know several game developers doing cyberwar, including one that is Wikipedia-famous.

    It could have something to do with game developer supply and demand.
