
posted by martyb on Tuesday January 02 2018, @02:54PM   Printer-friendly
from the who-will-be-first-to-put-an-8K-display-in-a-cell-phone? dept.

LG is showing off the world's largest and highest resolution OLED panel in an 88-inch TV at the Consumer Electronics Show:

Just as 4K and HDR are finally going mainstream, the ambitious folks at LG Display have also been busy pushing its OLED technology to 8K. Come CES, the Korean manufacturer will be letting attendees get up close with its new 88-inch [2.2 meter] 8K OLED display (can we just call it the "Triple 8?"), which is both the largest and the highest-resolution OLED panel to date. But as far as specs go, that's all we have for now.

Also at The Verge and BGR.


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 0) by Anonymous Coward on Tuesday January 02 2018, @03:54PM (10 children)

    by Anonymous Coward on Tuesday January 02 2018, @03:54PM (#616735)

    Meanwhile, there's so little content other than porn in 4k that it doesn't make sense except for jacking off

  • (Score: 3, Touché) by massa on Tuesday January 02 2018, @04:36PM

    by massa (5547) on Tuesday January 02 2018, @04:36PM (#616757)

    or, as it is better known, "everyday use" :-)

  • (Score: 4, Informative) by TheRaven on Tuesday January 02 2018, @04:46PM (8 children)

    by TheRaven (270) on Tuesday January 02 2018, @04:46PM (#616762) Journal
    I don't see the need for it for video, but 4K makes a huge difference for text rendering at normal monitor viewing distances. You barely need antialiasing, and what AA you do have covers such a small proportion of the total line width that text is significantly crisper. 8K would probably eliminate the need for AA entirely, but it's well into diminishing returns at that point (unless you're using a screen so big that you need to move your head to see all of it).
    --
    sudo mod me up
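
A back-of-the-envelope illustration of the point above (the 1/14-em stroke width and the pixel densities are rough assumptions for illustration, not measurements): at higher pixel densities, a one-pixel antialiased edge becomes a much smaller fraction of each text stroke, so any residual blur matters less.

```python
# Rough sketch: how large is a one-pixel AA fringe relative to a text stroke?
# The stroke width (about 1/14 of the em size for a regular-weight face) is an
# illustrative assumption, not a measured value.

def stroke_and_fringe(ppi, point_size=11, stroke_em_fraction=1 / 14):
    em_px = point_size / 72 * ppi            # em height in pixels (72 points per inch)
    stroke_px = em_px * stroke_em_fraction   # approximate stem width in pixels
    return stroke_px, 1.0 / stroke_px        # a 1 px fringe as a fraction of the stroke

for ppi in (96, 163, 218):   # roughly 23" 1080p, 27" 4K, 27" 5K
    stroke, frac = stroke_and_fringe(ppi)
    print(f"{ppi:3d} PPI: stroke ~{stroke:.1f} px, 1 px AA fringe ~{frac:.0%} of the stroke")
```
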
    • (Score: 1, Insightful) by Anonymous Coward on Tuesday January 02 2018, @05:36PM (3 children)

      by Anonymous Coward on Tuesday January 02 2018, @05:36PM (#616786)

      If you went back 10 years ago and replaced "4K" with "1920x1200" you would have been able to say the same things.

      But the future always sucked because mainstream adoption went to 1920x1080.

      They couldn't even upgrade from 1024x768 to something cool like 1280x1024 or 1680x1050 to tide them over until 1920x1200--no it had to be a stupid TV resolution.

      And now we have "4K", which isn't 4K in either direction. They don't call 1080 "2K", and they don't call 2160 "2K" either, which tells you that the present state of affairs is marketing to ignorant people.

      Is 8K something like 7168? That's closer to 7K than 8K, and nobody talks about that little second number anymore... it's like height is bad or something. I guess letterboxes suck? Too bad monitors can make use of all of that space... oh wait, right, ignorant people and marketing, nvm.

      • (Score: 3, Interesting) by bob_super on Tuesday January 02 2018, @07:10PM

        by bob_super (1357) on Tuesday January 02 2018, @07:10PM (#616828)

        TV "4K" (3840x2160) has a lot of infrastructure in common with movie 4K (4096x2160). The content industry is very happy that SMPTE finally got some of their duck in a row and provided standards (manufacturer-driven, obviously) to simplify many of the workflows. 2K and 4K were originally the shortcuts for the movie side of things (DCM). 8K makes sense because it's 2x each way again.

        I did miss 1920x1200 for years (going to 1080 lost 10% of the lines, unless you went portrait mode), until the 2160-line 40-inch "monitor" on which I'm currently typing dropped to $400. There will not be a need for 8K on my desktop unless I also get very good reading glasses.
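
To put numbers on the naming discussion in the two comments above (these are the standard DCI and UHD pixel counts, nothing specific to LG's panel), here is a small sketch listing the "K" labels next to the actual dimensions and checking that each UHD generation doubles the previous one in each direction:

```python
# Sketch: marketing "K" labels vs. actual pixel dimensions.
# DCI figures are the digital-cinema standards; UHD figures are the TV standards.
resolutions = [
    ("DCI 2K",        2048, 1080),
    ("DCI 4K",        4096, 2160),
    ("FHD / '1080p'", 1920, 1080),
    ("UHD '4K'",      3840, 2160),
    ("UHD '8K'",      7680, 4320),
]

for name, w, h in resolutions:
    print(f"{name:14s} {w}x{h}  ({w * h / 1e6:5.1f} Mpixels)")

# Each UHD generation doubles both dimensions, i.e. quadruples the pixel count:
assert (3840, 2160) == (2 * 1920, 2 * 1080)
assert (7680, 4320) == (2 * 3840, 2 * 2160)
```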

      • (Score: 2) by TheRaven on Wednesday January 03 2018, @03:03PM

        by TheRaven (270) on Wednesday January 03 2018, @03:03PM (#617169) Journal

        If you went back 10 years ago and replaced "4K" with "1920x1200" you would have been able to say the same things.

        Nope, I had a 1920x1200 screen 10 years (actually, 13) ago and it still needed sub-pixel AA for text to not appear jagged. Actually, I did briefly use a 4K monitor back then - an IBM model that used two DVI connectors to get sufficient bandwidth. I now have a 4K screen that's the same size, and there's only a small difference if I turn off AA.

        --
        sudo mod me up
      • (Score: 1) by toddestan on Friday January 05 2018, @02:25AM

        by toddestan (4982) on Friday January 05 2018, @02:25AM (#618171)

        If you went back 10 years ago and replaced "4K" with "1920x1200" you would have been able to say the same things.

        But the future always sucked because mainstream adoption went to 1920x1080.

        The vast majority of 1920x1200 monitors were 24" (or even larger), which was dumb as that works out to less than 100 PPI. Same with the also-common 22" 1680x1050 screens. You actually got more PPI out of the once-common 17" 1280x1024 monitors. I never understood why the 16:10 monitors were that way - they really needed to bump the resolution up one step for each of the common screen sizes, i.e. 1920x1200 should have been 22", 2560x1600 should have been 24", etc.

        One thing they did get kind of right with 1920x1080 is the 21.5" screen size you could get them in, which was at least above 100 PPI.
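
A quick sanity check of those pixel-density figures (PPI here is just the pixel count along the diagonal divided by the diagonal in inches; the screen sizes are the common diagonals mentioned above):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch along the diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

for w, h, d in [(1920, 1200, 24), (1680, 1050, 22), (1280, 1024, 17),
                (1920, 1080, 21.5), (3840, 2160, 40)]:
    print(f'{w}x{h} @ {d}": {ppi(w, h, d):6.1f} PPI')
```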

    • (Score: 0) by Anonymous Coward on Tuesday January 02 2018, @05:38PM

      by Anonymous Coward on Tuesday January 02 2018, @05:38PM (#616788)

      It also depends on view distance and screen size. The problem with these bigger and higher resolution screens is that many people are watching their TV from across the room, where the increase in detail doesn't matter. There is a reason why TV stores don't let you see anything but their largest at any distance resembling normal. See here for more: http://s3.carltonbale.com/resolution_chart.html [carltonbale.com]
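
A rough sketch of the viewing-distance argument (assuming the usual ~1 arcminute of acuity for 20/20 vision and a 16:9 aspect ratio; both are illustrative assumptions, and the linked chart makes the same point graphically):

```python
import math

def max_useful_distance_m(diagonal_in, width_px, acuity_arcmin=1.0, aspect=16 / 9):
    """Farthest viewing distance at which adjacent pixels are still separable,
    for an eye that resolves about `acuity_arcmin` arcminutes (~20/20)."""
    width_in = diagonal_in / math.sqrt(1 + 1 / aspect**2)   # screen width from diagonal
    pixel_in = width_in / width_px                          # pixel pitch in inches
    theta = math.radians(acuity_arcmin / 60)                # acuity limit in radians
    return pixel_in / theta * 0.0254                        # inches -> metres

# An 88" panel at three resolutions:
for label, w in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
    d = max_useful_distance_m(88, w)
    print(f'88" {label}: individual pixels blur together beyond ~{d:.1f} m')
```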

    • (Score: 3, Insightful) by TheLink on Tuesday January 02 2018, @05:44PM

      by TheLink (332) on Tuesday January 02 2018, @05:44PM (#616795) Journal
      I'm not excited at all about big screens. I'm waiting for _small_ 8k screens per eye (and the hardware and optics for them). That should make things a lot better for AR/VR. Then we can have screens as huge and as many as our hardware and wetware can cope with.

      And hopefully OS builders will actually make GUIs and OSes that _help_ make such stuff easy for humans and augment us, rather than doing retarded stuff like dumbing desktops/workstations down to tablet-style UIs.

      Intel and AMD might get excited about that... There's not as much need for 32 cores and 128GB of RAM if your desktop OS only makes it easy for you to do tablet crap.
    • (Score: 1, Informative) by Anonymous Coward on Tuesday January 02 2018, @08:09PM (1 child)

      by Anonymous Coward on Tuesday January 02 2018, @08:09PM (#616865)

      You barely need antialiasing [with 4K displays]

      Screen resolution has almost nothing to do with anti-aliasing.

      In computer graphics, aliasing normally refers to a type of quantization error caused by rendering to a fixed pattern of pixels (in the case of a computer monitor this is a rectangular grid). It occurs when the source information does not exactly align with the pixel grid. Normally these errors are periodic and highly non-random, resulting in significant damage to the original signal (it is called "aliasing" because different signals become indistinguishable -- hence aliases). The effect is the same sort of error you see in a Moiré pattern [wikipedia.org].

      In this context, anti-aliasing is a form of dither which adds randomness to remove aliasing errors as much as possible, replacing it with a small amount of white noise. Anti-aliasing is always required for correct rendering on a fixed grid of pixels like a computer monitor -- regardless of the resolution of that pixel grid.

      Nevertheless, up to a certain point, higher resolution will look better because the noise floor from dithering will be reduced. The ~100 pixels/inch displays of yesteryear are definitely not ideal for small text so I say bring out more pixels!

      (NB: When done correctly, anti-aliasing will completely eliminate all aliasing errors (with high probability) for signals above the noise floor by the use of random sampling. However, in raster graphics it is very common to use post-processing techniques to hide visible artifacts rather than true anti-aliasing. Such techniques are sometimes confusingly called "anti-aliasing" even though these don't actually do anything to prevent aliasing!)
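
A small numerical sketch of the distinction drawn above (plain NumPy; the frequencies are made up purely for illustration): sampling a pattern finer than the pixel grid at regular pixel centres aliases it into a spurious low-frequency pattern, while jittered (randomly offset) sampling trades that structured error for broadband noise, which is the dithering idea described in the comment.

```python
import numpy as np

rng = np.random.default_rng(0)

f_signal = 99.0    # "fine detail": 99 cycles across the image width
n_pixels = 100     # fewer than 2*99 samples, so the detail cannot be represented
f_alias = abs(f_signal - n_pixels)   # the spurious frequency it masquerades as (1 cycle)

signal = lambda x: np.sin(2 * np.pi * f_signal * x)

# Regular grid sampling: the 99-cycle pattern becomes indistinguishable from a 1-cycle one.
x_grid = np.arange(n_pixels) / n_pixels
regular = signal(x_grid)

# Jittered sampling (a simple randomized anti-aliasing strategy): each sample is taken
# at a random position inside its pixel, trading the structured alias for broadband noise.
x_jitter = (np.arange(n_pixels) + rng.random(n_pixels)) / n_pixels
jittered = signal(x_jitter)

alias_wave = np.sin(2 * np.pi * f_alias * x_grid)

def correlation(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print("correlation with the spurious low-frequency pattern:")
print(f"  regular grid : {correlation(regular, alias_wave):+.2f}")   # exactly -1.00 here
print(f"  jittered     : {correlation(jittered, alias_wave):+.2f}")  # typically near 0
```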

      • (Score: 0) by Anonymous Coward on Wednesday January 03 2018, @10:39AM

        by Anonymous Coward on Wednesday January 03 2018, @10:39AM (#617119)

        You miss the point completely and are overrated.

        Anti-aliasing is always required for correct rendering on a fixed grid of pixels like a computer monitor -- regardless of the resolution of that pixel grid.

        Not if you are viewing a very high resolution grid with comparatively low resolution eyes.

        Then the limitations of your eyes will do the "anti-aliasing". No need for the grid to do any of that.
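
For a rough sense of the numbers behind that last exchange (assuming ~1 arcminute of acuity, an illustrative figure), here is the pixel density at which adjacent pixels stop being separately resolvable at a few desktop viewing distances; whether that actually removes the need for anti-aliasing at render time is exactly what the two posters above disagree about.

```python
import math

def ppi_at_acuity_limit(distance_m, acuity_arcmin=1.0):
    """Pixel density at which one pixel subtends `acuity_arcmin` arcminutes."""
    pixel_m = distance_m * math.radians(acuity_arcmin / 60)   # pixel pitch in metres
    return 0.0254 / pixel_m                                   # convert to pixels per inch

for d in (0.4, 0.6, 0.8):   # plausible desktop viewing distances, in metres
    limit = ppi_at_acuity_limit(d)
    print(f"{d:.1f} m: individual pixels stop being resolvable above ~{limit:.0f} PPI")
```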