
posted by martyb on Friday March 16 2018, @09:53AM   Printer-friendly
from the moar-faster-pixels dept.

Google and LG will show off an OLED display for virtual reality headsets that could have a resolution of around 5500×3000:

Google and LG are set to present an 18-megapixel 4.3-inch OLED headset display with 1443 ppi and a 120Hz refresh rate during the Display Week 2018 trade show in late May. The display will have a wide field of view and high acuity. The advance program for the expo was spotted by Android Police via OLED-Info.

Those specs put the forthcoming display ahead of most of what's on the market. The new HTC Vive Pro and Oculus Rift boast total resolutions of only 2880×1600 and 2160×1200, respectively.

From the Display Week 2018 Symposium Program:

The world's highest resolution (18 megapixel, 1443 ppi) OLED-on-glass display was developed. White OLED with color filter structure was used for high-density pixelization, and an n-type LTPS backplane was chosen for higher electron mobility compared to mobile phone displays. A custom high bandwidth driver IC was fabricated. Foveated driving logic for VR and AR applications was implemented.

The competing "Pimax 8K" uses two 3840×2160 panels to hit 7680×2160 with a 200° field of view. Shipments of that headset have been delayed to April or later. A 2017 StarVR headset used two 2560×1440 panels for a 210° field of view. Two of the panels from Google and LG could add up to around 11000×3000 (based on The Verge's guess), 12000×3000 (36 megapixels), or 11314×3182 (36 megapixels, 32:9 aspect ratio).

Recall that AMD has envisioned VR resolution reaching 16K per eye (a grand total of 30720×8640, or over 265 megapixels).
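
A quick sanity check of these pixel sums, sketched in Python (the per-panel dimensions are the guesses quoted above, not confirmed specs):

    # Verify the megapixel counts and aspect ratios quoted above.
    candidates = {
        "single panel (approx.)":   (5500, 3000),
        "two panels (Verge guess)": (11000, 3000),
        "two panels (36 MP)":       (12000, 3000),
        "two panels (32:9)":        (11314, 3182),
        "AMD 16K per eye (total)":  (30720, 8640),
    }

    for name, (w, h) in candidates.items():
        print(f"{name}: {w}x{h} = {w * h / 1e6:.1f} MP, {w / h:.2f}:1")

    # 11314 x 3182 comes to ~36.0 MP at ~3.56:1 (i.e. 32:9), and
    # 30720 x 8640 comes to ~265.4 MP, matching "over 265 megapixels".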

List of common resolutions.

Also at UploadVR and Android Authority.

Related: Is Screen Resolution Good Enough Considering the Fovea Centralis of the Eye?
AU Optronics to Ship 8K Panels to TV Manufacturers in H1 2018


Original Submission

 
  • (Score: 0) by Anonymous Coward on Friday March 16 2018, @11:03AM (#653510)

    A good moment to become blind.

    • (Score: 2) by realDonaldTrump (6614) on Friday March 16 2018, @12:14PM (#653532)

      Trust me, you won't go blind. And you won't grow hair on the palms of your hands.

      • (Score: 0) by Anonymous Coward on Friday March 16 2018, @12:31PM (#653536)

        Will you grow hair on the top of your head?

  • (Score: 2) by opinionated_science (4031) on Friday March 16 2018, @12:16PM (#653534)

    This could be very useful for people with low vision...

    One of the biggest drops in perception for such patients is that the brain is unable to track a fragmented visual field, preventing "eye lock".

    Of course, these devices could make awesome games ;-)

    • (Score: 2) by cocaine overdose on Friday March 16 2018, @12:48PM (6 children)

      Awesome hentai games, you mean. I can't wait to be able to choke out a bitch in full Ultra Sonic HD, and then fuck her brains out with my invisible donger. [0] [lmgtfy.com]
    • (Score: 0) by Anonymous Coward on Saturday March 17 2018, @08:40PM (#654208)

      The one time I used a projector for my fave first-person shooter, it was awesome for the first 5 seconds. Then I realized a large screen put too much into my peripheral vision, making me more vulnerable. I guess simulations are much more entertaining, instead.

  • (Score: 2) by jasassin (3566) <jasassin@gmail.com> on Friday March 16 2018, @01:35PM (#653582)

    Now if only there were a computer that could push 125FPS at that resolution with high graphics settings. (Out of my price range.)

    --
    jasassin@gmail.com GPG Key ID: 0xE6462C68A9A3DB5A
    • (Score: 2) by takyon (881) <takyonNO@SPAMsoylentnews.org> on Friday March 16 2018, @05:58PM (#653712)

      The blurb mentions "Foveated driving logic for VR and AR applications was implemented." That means they are doing foveated rendering [theverge.com]: tracking your eyes and lowering detail in the areas of the screen your eyes aren't looking at.

      That could have a substantial impact on the bandwidth and performance required. Here's an older article [roadtovr.com] that seems to be teasing this same display:

      Bavor went on to explain the performance challenges of 20 MP per eye at 90-120 fps, which works out at unreasonably high data rates of 50-100 Gb/sec. He briefly described how foveated rendering combined with eye tracking and other optical advancements will allow for more efficient use of such super high resolution VR displays.

      (Note that it sounds like they want to use two of the displays for a wide field of view.)

      Let's say that foveated rendering can reduce the necessary bandwidth to 20%. Now you're down to roughly 18 Gbps (20% of the ~90 Gb/s figure above) for two 18 megapixel displays at 120 FPS, which can be carried by the latest DisplayPort [wikipedia.org] or HDMI.

      Also note that eye tracking from SMI can track gaze direction at 250 Hz [roadtovr.com], so you should be completely unable to "outrun the eye tracking" with your eyes if it is implemented right. I'm not sure you would notice if the foveated rendering failed to update for 8-16 ms anyway.
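
      For the curious, a back-of-envelope version of that math in Python (the 24 bits per pixel and the flat 20% foveation factor are my assumptions):

          # Raw and foveated link bandwidth for two 18 MP panels at 120 Hz.
          pixels_per_frame = 18e6 * 2   # two 18-megapixel displays
          fps = 120
          bits_per_pixel = 24           # 8 bits per RGB channel (assumed)

          raw = pixels_per_frame * fps * bits_per_pixel / 1e9
          foveated = raw * 0.20         # assume foveation keeps ~20% of the data

          print(f"raw: {raw:.1f} Gb/s")            # ~103.7 Gb/s, in line with
                                                   # Bavor's 50-100 Gb/s range
          print(f"foveated: {foveated:.1f} Gb/s")  # ~20.7 Gb/s; DisplayPort 1.4
                                                   # payload is ~25.9 Gb/s

          # A 250 Hz eye tracker samples every 4 ms, comfortably inside the
          # ~8.3 ms frame time of a 120 Hz display.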

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 2) by bob_super (1357) on Friday March 16 2018, @11:07PM (#653840)

        I just hope that your VR headset comes with noise-cancelling speakers. The machine driving that many pixels isn't gonna let you hear the virtual floorboard creaks from the monster right behind you.

        • (Score: 2) by takyon (881) <takyonNO@SPAMsoylentnews.org> on Friday March 16 2018, @11:59PM (#653855)

          Is that a joke? Because I don't see your point.

          It seems to me that foveated rendering could easily decrease the required transmission bandwidth and possibly graphics (T)FLOPS by an order of magnitude. The effect can be clearly demonstrated even when I look at this ~740x493 image [theverge.com] on a laptop screen a couple feet away from me. If I were wearing a 200° field of view headset, a much larger percentage of the image would be outside my focus at any given moment. I think paracentral vision [wikipedia.org], where the most pixels would need to be rendered, is under 5% of the human FOV (plz fact check if you can).

          The bandwidth matters only if the graphics processing is done externally. A DisplayPort cable can move a lot of pixels per second, and if foveated rendering reduces that to 10-20%, you won't even need the unreleased DisplayPort 1.5 or whatever. But cables are kind of stupid. We would like to see WiGig/802.11ad [wikipedia.org] or something similar [wikipedia.org] to wirelessly connect the headset to a desktop.

          If less GPU performance is necessary with foveated rendering, that could be a boon for standalone headsets that use an internal SoC instead of an external GPU. For example, the Lenovo Mirage Solo [tomshardware.com] will use a Snapdragon 835 (rather than the faster Snapdragon 845 [anandtech.com]) with a 2560x1440 75Hz display. It does that without using eye tracking or foveated rendering, as far as I can tell. 11314×3182 @ 120 Hz is just under 16 times more pixels per second. With a few more years of advancement in mobile GPUs, combined with foveated rendering, maybe it will be possible to make a standalone headset capable of that. Any further GPU improvements can go directly into increasing the level of detail.
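
          To put rough numbers on both points, a Python sketch (the 8 degree paracentral diameter and 135 degree vertical FOV are my assumptions, so take the first figure loosely):

              # 1) Fraction of a wide-FOV headset's pixels that need full detail,
              #    treating angular area as roughly width x height.
              paracentral_deg = 8.0        # assumed diameter of paracentral vision
              h_fov, v_fov = 200.0, 135.0  # hypothetical headset field of view
              print(f"{paracentral_deg ** 2 / (h_fov * v_fov):.2%}")  # ~0.24%

              # 2) Pixel rate of dual panels at 11314x3182 @ 120 Hz vs. the
              #    Mirage Solo's 2560x1440 @ 75 Hz.
              ratio = (11314 * 3182 * 120) / (2560 * 1440 * 75)
              print(f"{ratio:.1f}x")  # ~15.6x, i.e. "just under 16 times"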

          For audio, some AMD GPUs include a TrueAudio [wikipedia.org] coprocessor, and the newest ones have switched to True Audio Next, which can use the GPU to "simulate audio physics". I take this to mean the effect of sounds "bouncing" off of virtual objects, walls, floors, etc. Real-time sound and speech synthesis may also become a standard feature in future games.

          --
          [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
          • (Score: 2) by bob_super (1357) on Saturday March 17 2018, @12:08AM (#653861)

            A machine that computes this 20MP-per-eye scene, runs the foveation algorithms, and outputs your 18 Gb/s stream is, as of today, pretty f___ing loud, and not even liquid cooling is gonna make it silent. A headset receiving that much data and using it to drive a screen is also likely to need some active cooling.
            So your eyes may be in heaven, but your total immersion into your game/porn may be hindered by the noise of all the fans making it possible.

            Sure, in a few years it will be better. For now, 16x a few watts equals a lot of heat.

            • (Score: 2) by takyon (881) <takyonNO@SPAMsoylentnews.org> on Saturday March 17 2018, @12:26AM (#653872)

              It's a problem desktop gamers already face with certain big, hot GPUs. I don't think VR makes the problem that much worse. AMD and Nvidia will still put out hot and loud GPUs, and some gamers will buy them and put them in SLI.

              However, if we are able to go the 802.11ad/ay route, you would at least be able to sit or stand a few more meters away from the desktop (i.e. at the opposite end of a room). And you would also be more immersed due to the lack of a tether: you could lie down, spin around, fall and hit your head on the coffee table, killing yourself in minutes, etc.

              --
              [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 2) by VLM (445) on Friday March 16 2018, @03:00PM (#653618)

    The world's highest resolution (18 megapixel, 1443 ppi) OLED-on-glass display was developed.

    I wonder what the PPI limit is for displays. Probably something to do with brightness or heat.

    I find it interesting that a display is silicon that squirts out light; if you're willing to talk about silicon that eats light and squirts out electrons, I am holding in my hands a Canon 7D Mk II DSLR camera body, which was retail shipping-level tech a couple of years ago (as compared to the trade-show vaporware that is the 1443 ppi display). If Wikipedia is correct, the sensor's 5472 pixels across 22.4 mm work out to the equivalent of 6204 pixels per inch.

    So... admittedly this is apples to oranges, given that one shipped years ago and the other is trade-show vapor, but given two similar slabs of silicon, you can turn them into a light emitter or a light eater, and the light eater's pixels will be on the order of 1/4 the width of the light emitter's.
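
    Checking that in Python (a sketch; the sensor width is the Wikipedia figure quoted above):

        MM_PER_INCH = 25.4
        sensor_px, sensor_mm = 5472, 22.4   # Canon 7D Mk II, horizontal
        sensor_ppi = sensor_px / (sensor_mm / MM_PER_INCH)
        display_ppi = 1443                  # Google/LG OLED panel

        print(f"sensor: {sensor_ppi:.0f} ppi")                  # ~6205 ppi
        print(f"pitch ratio: {sensor_ppi / display_ppi:.1f}x")  # ~4.3x
        # i.e. the sensor's pixels are roughly 1/4 the width of the
        # display's, as estimated above.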

    Initially I guessed it has something to do with optimizing heat vs. quantum efficiency. You're lucky if the QE of a camera sensor is over 1/3: the 7D Mk I seems to have a QE around 35% according to some astrophotography references. I donno about the Mk II, but it's presumably similar and likely better. Solar cells can be higher because they don't have to provide nice color rendition or pixelized images, merely raw amps of current. Presumably the incident light power that doesn't turn into sensor electron output turns into heat; where else would it go?

    Trying to figure out the equivalent spec for LED output (how much electricity turns into light vs. heat) leads down many rabbit holes of "luminous efficacy" and similar, but it seems silicon light sources are not much more efficient than silicon light sensors with respect to the fraction of heat generated. So that hypothesis turns out not to be the case: sensors and displays are roughly like wire radio antennas, where efficiency of transmission is roughly the same as efficiency of reception.

    Another guess was brightness. Superficially the DSLR can eat extremely bright light, but only because the shutter is open for an extremely small fraction of total time. However, this camera is a decent, or at least usable, video recorder, so it can eat light and dissipate heat continuously; overheating is not an issue. So that hypothesis fails too.

    So in summary, I donno why the same process line that produces Canon DSLR sensors couldn't be very slightly modified to manufacture 6000-or-so PPI displays, other than the fairly obvious "nobody has bothered to try yet" or "nobody has bothered to fund the experiment yet".

    I suppose the only real use for a 6000 DPI display would be making little tiny VGA-resolution screens to show pr0n videos to paramecia and amoebas and other microscopic protozoa. Whoa man, check out the cilia on that armophorean, she needs to shave, because 70s protozoa pr0n doesn't sell to today's kinky tetrahymenas or vorticellas like it used to in the old days.
