
SoylentNews is people

posted by mrpg on Thursday July 19 2018, @07:30PM
from the now-it-looks-like-the-21st-century dept.

VR rivals come together to develop a single-cable spec for VR headsets

Future generations of virtual reality headsets for PCs could use a single USB Type-C cable for both power and data. That's thanks to a new standardized spec from the VirtualLink Consortium, a group made up of GPU vendors AMD and Nvidia and virtual reality rivals Valve, Microsoft, and Facebook-owned Oculus.

The spec uses the USB Type-C connector's "Alternate Mode" capability to implement different data protocols—such as Thunderbolt 3 data or DisplayPort and HDMI video—over the increasingly common cables, combined with Type-C's support for power delivery. The new headset spec combines four lanes of HBR3 ("high bit rate 3") DisplayPort video (for a total of 32.4 gigabits per second of video data), a USB 3.1 Gen 2 (10 gigabit per second) data channel for sensors and on-headset cameras, and 27W of electrical power.

That much video data is sufficient for two 3840×2160 streams at 60 frames per second, or even higher frame rates if Display Stream Compression is also used. Drop the resolution to 2560×1440, and two uncompressed 120 frame per second streams would be possible.
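The bandwidth claims can be sanity-checked with some quick arithmetic. This sketch assumes 24 bits per pixel and ignores blanking and protocol overhead, so the figures are approximate; 32.4 Gb/s is the raw rate of four HBR3 lanes.

```python
# Back-of-the-envelope check of the article's bandwidth figures.
# Assumes 24 bits per pixel, uncompressed, no blanking overhead.

def stream_gbps(width, height, fps, bits_per_pixel=24):
    """Uncompressed video bandwidth for one stream, in Gb/s."""
    return width * height * fps * bits_per_pixel / 1e9

# Two 3840x2160 streams at 60 FPS (one per eye):
uhd_pair = 2 * stream_gbps(3840, 2160, 60)   # ~23.9 Gb/s

# Two 2560x1440 streams at 120 FPS:
qhd_pair = 2 * stream_gbps(2560, 1440, 120)  # ~21.2 Gb/s

print(f"2x 4K @ 60 FPS:     {uhd_pair:.1f} Gb/s")
print(f"2x 1440p @ 120 FPS: {qhd_pair:.1f} Gb/s")
# Both totals fit under the 32.4 Gb/s of four HBR3 lanes,
# consistent with the article's claim.
```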

Framerate is too low, and it's not wireless. Lame.

VirtualLink website. Also at The Verge.

Original Submission

  • (Score: 2) by takyon on Friday July 20 2018, @12:20AM (1 child)

    by takyon (881) on Friday July 20 2018, @12:20AM (#709695) Journal

    If you're at 90-120 FPS, you have roughly 8-11 milliseconds per frame to work with. The headset could have a dedicated component for decompression. However, if the endgame is around 240 FPS, things are going to get tricky.
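    The per-frame budget the comment refers to is just 1000 ms divided by the frame rate; a quick sketch:

```python
# Per-frame time budget at various VR refresh rates: rendering,
# transport, and any decompression must all fit in this window.

def frame_budget_ms(fps):
    return 1000.0 / fps

for fps in (90, 120, 240):
    print(f"{fps:3d} FPS -> {frame_budget_ms(fps):.1f} ms per frame")
# 90 FPS leaves ~11.1 ms, 120 FPS ~8.3 ms, and 240 FPS only ~4.2 ms,
# which is why decompression latency gets tricky at the high end.
```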

    A tightly integrated standalone VR SoC may be much better equipped to deal with latency than a gaming PC, and would not have to deal with the wireless transmission issue at all. Yes, that means that we eventually want GPU performance that exceeds today's big ~300 W cards in the footprint of a <5 W smartphone SoC.

    If this is anything to go by, they want to reduce the screen door effect and other issues by massively increasing pixel density. However, the same company (LG) is also experimenting with using a diffusion filter. So there are multiple ways in development to alleviate the issue.

  • (Score: 2) by Immerman on Friday July 20 2018, @01:43AM

    by Immerman (3985) on Friday July 20 2018, @01:43AM (#709739)

    Refresh rate and lag are completely independent properties - it's perfectly possible to run at 120 FPS with several seconds of lag. Lots of TVs offer 120 Hz refresh rates with well over 100 ms of lag, meaning the image displayed is more than 10 frames behind the image received (ignoring the fact that half the frames are actually computer-generated "filler" that were never received in the first place). But if you have more than about 15-20 ms of lag in VR you're likely to get sick quickly, and as I recall you have to get down below about 3-5 ms before the nausea disappears completely for most people. There is currently no consumer hardware that can get anywhere close to that, and I haven't heard any credible claims that refresh rate has any effect on how much lag is tolerable.
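    The TV example above can be made concrete with a line of arithmetic - lag in frames is the lag time divided by the frame duration:

```python
# How many whole frames behind the displayed image is, given a
# display lag and a refresh rate (frame duration = 1000 / Hz).

def frames_behind(lag_ms, refresh_hz):
    frame_ms = 1000.0 / refresh_hz
    return lag_ms / frame_ms

# The TV case from the comment: 120 Hz panel with 100 ms of lag.
print(frames_behind(100, 120))  # ~12 frames behind

# The VR comfort thresholds mentioned above, expressed at 120 Hz:
print(frames_behind(20, 120))   # ~2.4 frames
print(frames_behind(4, 120))    # ~0.5 frames
```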

    Shit, they were actually able to patent using a diffusion filter? I sure hope they did something really clever with it, because using a diffusion filter to eliminate screen doors and blockiness is incredibly obvious - I've been advocating it since before the first Oculus developer kits were shipped.