Submission Preview

Oculus Research Presents Focal Surface Display: Will Eliminate Nausea in VR

Accepted submission by exec at 2017-05-18 15:31:49
News

Story automatically generated by StoryBot Version 0.2.2 rel Testing.
StoryBot ('Arthur T Knackerbracket') has been converted to Python 3.

Note: This is the complete story and will need further editing. It may also be covered
by copyright and thus should be acknowledged and quoted rather than reproduced in its entirety.

FeedSource: [HackerNews]

Time: 2017-05-17 15:20:39 UTC

Original URL: https://www.oculus.com/blog/oculus-research-to-present-focal-surface-display-discovery-at-siggraph/ [oculus.com] using ISO-8859-1 encoding.

Title: Oculus Research Presents Focal Surface Display: Will Eliminate Nausea in VR

--- --- --- --- --- --- --- Entire Story Below --- --- --- --- --- --- ---

Oculus Research Presents Focal Surface Display: Will Eliminate Nausea in VR

Arthur T Knackerbracket has found the following story [oculus.com]:

It’s not every day that we’re able to peel back the curtain of Oculus Research. Today we’re excited to share the team’s innovative research on focal surface displays, which has potentially far-reaching implications for improved visual fidelity in VR. Oculus Research Scientists Nathan Matsuda, Alexander Fix, and Douglas Lanman recently had their work accepted by SIGGRAPH [siggraph.org], and we couldn’t wait to share a preview.

Focal surface displays mimic the way our eyes naturally focus on objects at varying depths. Rather than adding more and more discrete focus areas to achieve the same degree of depth, this new approach changes how light enters the display, using spatial light modulators (SLMs) to bend the headset's focus around 3D objects, increasing depth and maximizing the amount of space represented simultaneously.

All of this adds up to improved image sharpness and a more natural viewing experience in VR.
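To make the intuition concrete, here is a loose back-of-the-envelope sketch. It is not the team's actual method (which optimizes SLM phase patterns); it only illustrates why focal surfaces that adapt to the scene can beat a fixed stack of focal planes. All depths, scene contents, and the crude clustering model are assumptions for illustration, with depth measured in diopters (1/metres).

```python
import numpy as np

# Illustrative sketch only, not the authors' algorithm. Compare the
# depth error of a few *fixed* focal planes against fewer *adaptive*
# focal surfaces, modelled crudely by clustering a toy depth map.

rng = np.random.default_rng(0)

# Toy 8x8 depth map: a near object (~2.5 D) on a far background (~0.5 D).
depth = np.full((8, 8), 0.5)
depth[2:6, 2:6] = 2.5
depth += rng.normal(0.0, 0.05, depth.shape)  # mild surface relief

def fixed_plane_error(depth, planes):
    """Mean focus error when each pixel uses its nearest fixed plane."""
    err = np.abs(depth[..., None] - np.asarray(planes, dtype=float))
    return err.min(axis=-1).mean()

def focal_surface_error(depth, n_surfaces=2):
    """Mean focus error when focal surfaces adapt to the scene.
    A 'surface' is modelled as the mean depth of a cluster of similar
    depths, found by 1-D k-means (Lloyd's algorithm)."""
    flat = depth.ravel()
    centers = np.quantile(flat, np.linspace(0.25, 0.75, n_surfaces))
    for _ in range(20):
        labels = np.abs(flat[:, None] - centers).argmin(axis=1)
        centers = np.array([flat[labels == k].mean()
                            for k in range(n_surfaces)])
    return np.abs(flat - centers[labels]).mean()

# Fixed planes are chosen at design time, before the scene is known;
# the adaptive surfaces see the scene and need fewer of them.
fixed = fixed_plane_error(depth, planes=[0.0, 1.5, 3.0])
adaptive = focal_surface_error(depth, n_surfaces=2)
print(f"fixed planes: {fixed:.3f} D, focal surfaces: {adaptive:.3f} D")
```

In this toy setup, two scene-adaptive surfaces leave a far smaller residual focus error than three fixed planes, which is the "rather than adding more and more focus areas" point made above.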

“Quite frankly, one of the reasons this project ran as long as it did is that we did a bunch of things wrong the first time around,” jokes Research Scientist Fix. “Manipulating focus isn’t quite the same as modulating intensity or other more usual tasks in computational displays, and it took us a while to get to the correct mathematical formulation that finally brought everything together. Our overall motivation was to do things the ‘right’ way—solid engineering combined with the math and algorithms to back it up. We weren’t going to be happy with something that only worked on paper or a hacked-together prototype that didn’t have any rigorous explanation of why it worked.”

By combining leading hardware engineering, scientific and medical imaging, computer vision research, and state-of-the-art algorithms to focus on next-generation VR, this project takes a highly interdisciplinary approach—one that, to the best of our knowledge, has never been tried before. It may even let people who wear corrective lenses comfortably use VR without their glasses.

“It’s very exciting to work in a field with so much potential but where many interesting challenges still haven’t been solved,” notes Matsuda, a graduate student in the Northwestern McCormick School of Engineering. “To do so as a student on a small team with highly experienced researchers has been an amazing learning process.”

While we’re a long way out from seeing results in a finished consumer product, this emerging work opens up an exciting and valuable new direction for future research to explore. We’re committed to publishing research results that stand to benefit the VR/AR industry as a whole.

“It’s no secret that multiple academic and industrial teams are racing to move beyond fixed-focus headsets,” explains Lanman. “Vergence-accommodation conflict (VAC), eyeglasses prescriptions, and sharp viewing of near objects all motivate adjusting the focus of a VR display. As a researcher, I’m excited to share what our team has uncovered. That’s the joy of publishing—it opens the door to anyone building upon your efforts. As long as you’re thick-skinned enough, you should prepare to be surprised how much further your work can be carried by the worldwide academic community. The greatest challenge is getting other researchers excited enough to do that follow-on work, and I’m looking forward to attempting that at SIGGRAPH.”
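To put a rough number on the vergence-accommodation conflict Lanman mentions: optical focus is measured in diopters (the reciprocal of distance in metres), and the conflict is the gap between the distance at which the eyes converge on a virtual object and the distance at which a fixed-focus display forces them to accommodate. The distances below are assumed, illustrative values, not figures from the article:

```python
# VAC magnitude in diopters (1/metres) for a fixed-focus headset.
# Both distances are hypothetical, chosen only for illustration.
focal_plane_m = 2.0   # assumed fixed focal distance of the headset
object_m = 0.25       # virtual object held at arm's length

vac_diopters = abs(1.0 / object_m - 1.0 / focal_plane_m)
print(vac_diopters)  # 3.5 D of conflict for this near object
```

The farther a rendered object sits from the display's fixed focal plane (in diopter terms), the larger this mismatch, which is why near objects are the hard case that variable-focus approaches target.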

Stay tuned to the blog next week for an in-depth profile of the team behind this exciting research.

— The Oculus Team

-- submitted from IRC

