
posted by Fnord666 on Tuesday March 06 2018, @12:38PM
from the picking-up-steam dept.

The Rift now represents about 47 percent of all VR headset users on Steam, according to the survey, sneaking just past the Vive at about 45 percent. Microsoft's Windows Mixed Reality initiative, launched late last year, accounts for just over 5 percent of the VR users on the platform.

[...] The Valve hardware survey is a self-selected voluntary sample of all Steam users and only detects VR headsets that are actively plugged in to the computer when the survey tool is run. Still, the rough parity between the two headsets is noteworthy given the Vive's use of the SteamVR standard, which Valve continues to update.

While the Rift is relatively easy to set up and use through Steam, the HTC Vive isn't officially supported on the competing Oculus Home platform.


Original Submission

 
  • (Score: 2) by takyon on Tuesday March 06 2018, @02:48PM (7 children)

    by takyon (881) <reversethis-{gro ... s} {ta} {noykat}> on Tuesday March 06 2018, @02:48PM (#648510) Journal

    VR isn't going to mainstream for many years because it has to consolidate quite a bit

    That, and because it's fairly expensive ($400+ at the high end on top of everything that desktop users already have) and punishing to early adopters.

    Many cheaper headsets don't have 6DOF [thetechieguy.com] built in, almost all headsets have fields of view closer to 100-110° rather than 180-200°, and the high-end headsets have an annoying cord/tether to get in your way (solvable either with a high-frequency, short-range wireless connection, or by packing enough computing/graphics power for gaming into a standalone headset). In a few years, most of these issues could be overcome, there will be a lot more content, and headset resolution and frame rates will have improved.

  • (Score: 3, Interesting) by Immerman on Tuesday March 06 2018, @03:12PM (6 children)

    by Immerman (3985) on Tuesday March 06 2018, @03:12PM (#648520)

    That would have to be an impressive wireless connection, though. A 4K screen, probably the minimum for semi-decent VR, refreshing at 90 Hz with 32-bit color requires roughly a 24 Gb/s connection (about 3 GB/s). That's more than 3x the theoretical limit of an 802.11ac connection fully utilizing all eight spatial streams. 802.11ad has a little headroom, assuming ideal performance, and its claimed 10 μs lag should be plenty low, so perhaps it's just a matter of waiting for it to become affordable enough to add into the mix.
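    For anyone checking the arithmetic, here's the back-of-the-envelope version (a minimal sketch using the assumptions above - a 3840×2160 panel, 90 Hz, 32 bits per pixel, and no blanking or protocol overhead):

    # Raw uncompressed video bandwidth for a single 4K panel.
    width, height = 3840, 2160    # "4K UHD"
    refresh_hz = 90               # VR refresh rate
    bits_per_pixel = 32           # 8 bits per RGBA channel

    bps = width * height * bits_per_pixel * refresh_hz
    print(f"{bps / 1e9:.1f} Gb/s")      # ~23.9 Gb/s
    print(f"{bps / 8 / 1e9:.1f} GB/s")  # ~3.0 GB/s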

    Compression might help somewhat, but it introduces lag and provides very limited benefit unless you're using lossy compression, with the associated reduction in image quality.

    • (Score: 3, Informative) by takyon on Tuesday March 06 2018, @03:42PM (2 children)

      by takyon (881) <reversethis-{gro ... s} {ta} {noykat}> on Tuesday March 06 2018, @03:42PM (#648530) Journal

      When I mention high-frequency wireless for VR, I'm thinking of something exactly like WiGig [wikipedia.org], which has been pushed as a standard that could enable wireless headsets [venturebeat.com] and is now also called 802.11ad. I think you mean 7 ms rather than 10 μs.

      Compression could be introduced alongside foveated rendering [wikipedia.org]. If we can confine artifacts to the areas the eyes aren't focused on, they might not be noticed at all.

      Great image to demonstrate foveated rendering here. [theverge.com]
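      To put rough numbers on the idea, here's a sketch of the potential savings (the foveal fraction and peripheral density below are illustrative assumptions, not figures from any shipping headset):

      # Hypothetical foveated-compression savings on the 4K/90 Hz stream
      # discussed upthread: full pixel density in the foveal region,
      # reduced density everywhere else.
      total_bps = 3840 * 2160 * 32 * 90   # ~23.9 Gb/s uncompressed

      fovea_fraction = 0.05     # assume the foveal region covers ~5% of the frame
      periphery_scale = 0.25    # assume the periphery keeps 1/4 pixel density

      effective = fovea_fraction + (1 - fovea_fraction) * periphery_scale
      print(f"~{total_bps * effective / 1e9:.1f} Gb/s")   # ~6.9 Gb/s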

      • (Score: 2) by Immerman on Wednesday March 07 2018, @02:26PM (1 child)

        by Immerman (3985) on Wednesday March 07 2018, @02:26PM (#648991)

        I was looking up wireless info on the fly, so it's quite possible I got 10 μs from marketing literature - I do also remember a line about wired-comparable lag. Perhaps there's a way to trade away some error-resistance in exchange for lower lag?

        7 ms would be extremely disappointing - I believe the consensus is that lag has to be below ~15 ms for a pleasant sustained experience. As high as 20 ms may be tolerable for most, and further improvements are likely to be apparent down to at least 7 ms.

        Taking 7 ms off the top just for transmission leaves only ~8 ms for all game logic and rendering if we still want to hit that magic 15. That translates to requiring roughly twice as much performance on the remote PC to deliver the same experience, and it makes a 7 ms total completely unattainable. It would also make it absolutely essential to use a different, "lagless" technology for communicating with the sensors - an additional 7 ms of lag just to get head-position data would be crushing.
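        Spelling that out as a budget (a sketch; the 15 ms target and the 7 ms link lag are the numbers from this thread, the rest is simplification):

        # Motion-to-photon latency budget over a wireless link.
        BUDGET_MS = 15.0       # comfort target discussed above
        LINK_MS = 7.0          # claimed 802.11ad transmission lag

        remaining = BUDGET_MS - LINK_MS
        print(f"Left for simulation + rendering: {remaining:.1f} ms")
        # A tethered headset gets the full ~15 ms for simulation and
        # rendering; the wireless one gets ~8 ms, so the remote PC needs
        # roughly 15/8 ≈ 1.9x the performance for the same experience.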

        Lag is the real killer here - to the point that a much lower bandwidth would potentially be an acceptable price to pay. It might be possible to offload the final stage of rendering to the headset itself by feeding it heavily optimized rendering data rather than the final image.

        As for foveated rendering - it does seem like a decent way to squeeze acceptable performance out of limited hardware, but I would want to see it in action.

        I have a sinking feeling that the "optimum compromise" may, for the immediate future, be leaving the PC out of it entirely and simply accepting lower-quality rendering. Wireless Vive-class tracking with Wii-grade graphics (rendered at 2-4K, of course) might actually be quite acceptable - immersion is likely a much more "killer" feature than realism. Leave realism to the folks willing to deal with either a tether or a backpack PC for a while.

        • (Score: 2) by Immerman on Wednesday March 07 2018, @02:38PM

          by Immerman (3985) on Wednesday March 07 2018, @02:38PM (#648998)

          Hmm - perhaps such a headset could draw on the R&D done for modular cellphones. It'd be really nice to be able to easily upgrade just the processor on such a potentially expensive piece of kit. Plug in properly optimized hardware and it could possibly even do decent final-stage rendering for a PC.

    • (Score: 2) by VLM on Tuesday March 06 2018, @09:00PM (2 children)

      by VLM (445) Subscriber Badge on Tuesday March 06 2018, @09:00PM (#648688)

      World's full of people who think 1280 x 720 is high-def, so lower is likely quite adequate. PSVR looks OK enough and its specs aren't too over the top.

      Oldest trick in the book for compression is doing the graphics locally... a helmet with a video card hanging off the back as a counterweight would work pretty well. Another angle: the 5-to-10-year-old class of graphics coprocessors seen in phones and tablets still sells a lot of product without requiring huge heat sinks or much power.

      Something like google cardboard with somewhat higher power graphics and much better optics would probably work well.

      Of course this brings up the next problem: delivering DC power, at zero bits per second. I guess a microwave magnetron pointed at a rectenna diode array on the helmet would transfer quite a few mobile watts.

      • (Score: 2) by takyon on Wednesday March 07 2018, @02:31AM

        by takyon (881) <reversethis-{gro ... s} {ta} {noykat}> on Wednesday March 07 2018, @02:31AM (#648833) Journal

        I agree on 720p resolution (which is considered "HDTV", while 1080p is "Full HD"). I use a lot of 720p and 768p displays and they seem good enough for video viewing. I'm pretty sure (drank the Kool-Aid) that VR will benefit from ridiculous increases in resolution, but the focus should be on increased minimum frame rate (to above even 120 Hz), consistent frame rate, eliminating "screen tearing", etc.

        There is an emerging category of untethered "standalone headsets" such as Oculus Go [soylentnews.org], Mirage Solo [gizmodo.com], HTC Vive Focus [engadget.com], etc. that do the processing locally and seem to have decent resolution (Mirage Solo is 2560×1440). Prices for this category of devices will probably range from $150 to $400.

        They use smartphone SoCs like Snapdragon, which have decidedly less graphics power than, say, an Nvidia GTX 1060. It's probable that, while a beefy graphics card may be preferred to advance realism, you don't actually need as much power when various tricks are employed. Visual detail of a scene can be sacrificed in favor of hitting minimum frame rate targets. Foveated rendering can be used with eye tracking. Google has an algorithm called "Seurat" [soylentnews.org] which could lower scene complexity in certain cases, although it seems to have some constraints. Pimax claims to "double perceived frame rate" by alternately rendering one eye's image at a time [soylentnews.org] (their headset has two displays at an angle, like StarVR).
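        The "sacrifice detail to hit the frame rate target" trick is simple enough to sketch (illustrative only - the thresholds and step sizes are made-up tuning constants, not taken from any real VR SDK):

        # Dynamic resolution scaling: shrink the render target when a
        # frame runs long, grow it back slowly when there's headroom.
        TARGET_MS = 1000.0 / 90      # ~11.1 ms frame budget at 90 Hz

        def adjust_render_scale(scale: float, last_frame_ms: float) -> float:
            if last_frame_ms > TARGET_MS * 0.95:    # about to miss vsync
                return max(0.6, scale - 0.05)       # drop detail quickly
            if last_frame_ms < TARGET_MS * 0.75:    # plenty of headroom
                return min(1.0, scale + 0.01)       # recover it slowly
            return scale

        print(adjust_render_scale(1.0, 12.0))       # slow frame -> 0.95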

        I'm not sure how to feel about the standalone headsets. Sure, they remove the annoying tether that the more expensive Oculus Rift and HTC Vive have, and they offer 6 degrees of freedom using built-in positional tracking, which is more convenient than external tracking. But it seems like a waste of money and a niche market when you consider that hundreds of millions of people are constantly carrying smartphones and could just pop them into an empty headset. Build the VR features into new smartphones. People can carry their smartphone in a pocket and the soft-shell headset in a purse, backpack, or briefcase. With a Cardboard-like design, maybe you could fold the thing up and carry it in a pocket (hopefully without scratching the lenses).

        Over time, we will get to the point where 8K or 16K resolution per eye is offered:

        https://www.theverge.com/2018/1/10/16875494/pimax-8k-vr-headset-design-comfort-pixels-resolution-ces-2018 [theverge.com] (Pimax is offering 4K, i.e. 3840×2160, per eye and calling it "8K")
        https://arstechnica.com/gaming/2013/09/virtual-perfection-why-8k-resolution-per-eye-isnt-enough-for-perfect-vr/ [arstechnica.com] (a relic of an article from when Palmer Luckey was still on board)
        https://www.reddit.com/r/oculus/comments/3a24p9/vr_of_the_future_according_to_amd_16k_per_eye_240/ [reddit.com] (AMD STRETCH goalz)
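        Extending the raw-bandwidth arithmetic from upthread shows what those resolutions would mean for an uncompressed wireless link (a sketch assuming 32 bpp and two full-resolution eye buffers; the 120 Hz figure is an assumption, the 240 Hz one is AMD's):

        # Uncompressed video bandwidth at the per-eye resolutions
        # mentioned in the links above.
        def raw_gbps(w, h, hz, eyes=2, bpp=32):
            return w * h * bpp * hz * eyes / 1e9

        print(raw_gbps(3840, 2160, 90))    # Pimax "8K": ~47.8 Gb/s
        print(raw_gbps(7680, 4320, 120))   # true 8K per eye: ~254.8 Gb/s
        print(raw_gbps(15360, 8640, 240))  # 16K per eye at 240 Hz: ~2038 Gb/s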

      • (Score: 2) by Immerman on Wednesday March 07 2018, @02:32PM

        by Immerman (3985) on Wednesday March 07 2018, @02:32PM (#648995)

        I rather doubt they'd continue to agree that it was high-def when looking at a 40" screen from 6" away, which is roughly the field of view we're talking about. But yeah, it's likely good enough to make do, especially if they could blend the pixels together so it looked more like film than like looking through a screen door.