
SoylentNews is people

posted by martyb on Tuesday January 02 2018, @02:54PM
from the who-will-be-first-to-put-an-8K-display-in-a-cell-phone? dept.

LG is showing off the world's largest and highest resolution OLED panel in an 88-inch TV at the Consumer Electronics Show:

Just as 4K and HDR are finally going mainstream, the ambitious folks at LG Display have also been busy pushing its OLED technology to 8K. Come CES, the Korean manufacturer will be letting attendees get up close with its new 88-inch [2.2 meter] 8K OLED display (can we just call it the "Triple 8?"), which is both the largest and the highest-resolution OLED panel to date. But as far as specs go, that's all we have for now.

Also at The Verge and BGR.


Original Submission

Related Stories

AU Optronics to Ship 8K Panels to TV Manufacturers in H1 2018 21 comments

More 8K (4320p) TVs will be coming soon. AU Optronics has announced plans to ship 8K panels to TV manufacturers starting in the first half of 2018:

The lineup of panels featuring a 7680×4320 resolution will be aimed at ultra-high-end TVs and sizes will range from 65 to 85 inches, said Liao Wei-Lun, president of AUO's video products business group, at a press conference. The high-ranking executive did not disclose other specifications of the panels, such as luminance and contrast ratio, but given their positioning, it is logical to expect their characteristics to be comparable to 8K UHDTVs to be offered by LG and Samsung.

Multiple TV makers demonstrated various 8K UHDTVs at various trade shows in recent years, but so far no one has started to sell them. Given the lack of content, it is hard to expect high demand for 8K televisions in the next couple of years, aside from the halo factor - nonetheless, AUO expects 8K panels to account for 10% of its '65-inch and above' panel shipments in 2020. The presumably high cost of the panels would indicate that in terms of unit shipments this might still be a low-ish number. However, as with 4K displays, someone has to release 8K TVs to stimulate content providers to offer appropriate material. At this year's CES, Samsung demonstrated its Q9S, its first commercial 8K TV-set, but it did not announce its pricing or availability timeframe. LG and Sony also demonstrated their 8K TVs at CES 2018, but nothing is clear about their plans regarding these products.

[...] As for 8K displays for PCs, Dell is currently the only company to offer an 8K monitor (this one is based on a panel from LG, so the latter might introduce its own 8K display at some point). Philips last year promised to start shipments of its 328P8K monitor in 2018, so expect that product to hit the market in the coming months too.

Need something to watch on your 8K TV? How about the 2020 Olympics?

Also at DigiTimes.

Related: LG to Demo an 8K Resolution TV at the Consumer Electronics Show
Dell Announces First "Mass-Market" 8K Display
Philips Demos an 8K Monitor
Pimax Launches Kickstarter for "8K" Virtual Reality Headset
HDMI 2.1 Released
LG's 88-inch 8K OLED TV


Original Submission

LG Can't Meet Apple's Demand for iPhone OLED Displays 8 comments

LG Display reportedly can't meet Apple's demand for OLED screens due to manufacturing issues. This means that Apple will once again be reliant on its primary supplier and smartphone rival, Samsung:

Analysts have been warning for months that Apple is in "urgent" need of finding another iPhone OLED supplier besides Samsung. Apple currently uses Samsung's OLED displays for the company's iPhone X model. The reliance on a single supplier means Samsung controls pricing on the displays that Apple is buying — and there's no other alternative at the moment.

Also at WSJ and MacRumors.

Related: LG's 88-inch 8K OLED TV
Apple, Valve, and LG Invest in OLED Manufacturer eMagin
Google and LG to Show Off World's Highest Resolution OLED-on-Glass Display in May
Apple Building its Own MicroLED Displays for Eventual Use in Apple Watch and Other Products


Original Submission

  • (Score: 3, Funny) by looorg on Tuesday January 02 2018, @03:47PM (14 children)

    by looorg (578) on Tuesday January 02 2018, @03:47PM (#616732)

    Just hearing people talk about their 4K screens is annoying. I guess they'll soon upgrade and start to talk shit about how dreadful it is with only 4K and how they just can't watch anything lower than 8K. Once it becomes the new normal I guess LG (or whoever) will start to push their 16K screens. "It's so real it's more real than reality", or whatever the slogan will be.

    • (Score: 0) by Anonymous Coward on Tuesday January 02 2018, @03:54PM (10 children)

      by Anonymous Coward on Tuesday January 02 2018, @03:54PM (#616735)

      Meanwhile, there's so little content other than porn in 4k that it doesn't make sense except for jacking off

      • (Score: 3, Touché) by massa on Tuesday January 02 2018, @04:36PM

        by massa (5547) on Tuesday January 02 2018, @04:36PM (#616757)

        or, as it is better known, "everyday use" :-)

      • (Score: 4, Informative) by TheRaven on Tuesday January 02 2018, @04:46PM (8 children)

        by TheRaven (270) on Tuesday January 02 2018, @04:46PM (#616762) Journal
        I don't see the need for it for video, but 4K makes a huge difference for text rendering at normal monitor viewing distances. You barely need antialiasing and what AA you do have is such a small proportion of the total line width that it makes text significantly crisper. 8K would probably completely eliminate the need for AA, but it's well into diminishing returns at that point (unless you're using a screen that's so big that you need to move your head to see all of it).
        --
        sudo mod me up
        • (Score: 1, Insightful) by Anonymous Coward on Tuesday January 02 2018, @05:36PM (3 children)

          by Anonymous Coward on Tuesday January 02 2018, @05:36PM (#616786)

          If you went back 10 years ago and replaced "4K" with "1920x1200" you would have been able to say the same things.

          But the future always sucked because mainstream adoption went to 1920x1080.

          They couldn't even upgrade from 1024x768 to something cool like 1280x1024 or 1680x1050 to tide them over until 1920x1200--no it had to be a stupid TV resolution.

          And now we have "4K", which isn't 4K in either direction. They don't call 1080p "2K", and they don't call 2160 "2K" either,

          which goes to show that the present state of affairs is marketing to ignorant people. Is 8K something like 7168? I mean, that's closer to 7K than 8K, and no one talks about that little second number anymore... it's like height is bad or something. I guess letterboxes suck? Too bad monitors can make use of all of that space... oh wait, right, ignorant people and marketing, nvm

          • (Score: 3, Interesting) by bob_super on Tuesday January 02 2018, @07:10PM

            by bob_super (1357) on Tuesday January 02 2018, @07:10PM (#616828)

            TV "4K" (3840x2160) has a lot of infrastructure in common with movie 4K (4096x2160). The content industry is very happy that SMPTE finally got some of their ducks in a row and provided standards (manufacturer-driven, obviously) to simplify many of the workflows. 2K and 4K were originally the shortcuts for the movie side of things (DCM). 8K makes sense because it's 2x each way again.

            I did miss 1920x1200 for years (lost 10% of the lines, unless you went portrait mode), until it dropped to $400 for the 2160-line 40-inch "monitor" on which I'm currently typing. There will not be a need for 8K on my desktop, unless I also get very good reading glasses.
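The "2x each way" relationship described above is easy to check; a quick sketch of the TV and cinema (DCI) resolution ladders, using the sizes quoted in the comments:

```python
# Each "K" generation doubles both dimensions, quadrupling the pixel count.
LADDER = [
    ("1080p ('2K' TV)", 1920, 1080),
    ("DCI 2K (cinema)", 2048, 1080),
    ("4K UHD (TV)", 3840, 2160),
    ("DCI 4K (cinema)", 4096, 2160),
    ("8K UHD (TV)", 7680, 4320),
]

for name, w, h in LADDER:
    print(f"{name:18s} {w}x{h:<5d} {w * h / 1e6:6.1f} MP")

# The TV ladder doubles cleanly: 8K UHD is 2x 4K UHD each way, 4x the pixels.
assert (7680, 4320) == (2 * 3840, 2 * 2160)
assert 7680 * 4320 == 4 * (3840 * 2160)
```

Note the side point from the thread: only the DCI widths (2048, 4096) are actually "2K"/"4K"; the TV variants fall slightly short.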

          • (Score: 2) by TheRaven on Wednesday January 03 2018, @03:03PM

            by TheRaven (270) on Wednesday January 03 2018, @03:03PM (#617169) Journal

            If you went back 10 years ago and replaced "4K" with "1920x1200" you would have been able to say the same things.

            Nope, I had a 1920x1200 screen 10 years (actually, 13) ago and it still needed sub-pixel AA for text to not appear jagged. Actually, I did briefly use a 4K monitor then - an IBM model that used two DVI connectors to get sufficient bandwidth. I now have a 4K screen that's the same size, and there's a small difference if I turn off AA.

            --
            sudo mod me up
          • (Score: 1) by toddestan on Friday January 05 2018, @02:25AM

            by toddestan (4982) on Friday January 05 2018, @02:25AM (#618171)

            If you went back 10 years ago and replaced "4K" with "1920x1200" you would have been able to say the same things.

            But the future always sucked because mainstream adoption went to 1920x1080.

            The vast majority of 1920x1200 monitors were 24" (or even larger), which was dumb as it was less than 100 DPI. Same with the also common 1680x1050 at 22" screens. You actually got more PPI out of the once-common 1280x1024 17" monitors. I never understood why the 16:10 monitors were that way - they really needed to bump the resolution up one step for each of the common screen sizes, i.e. 1920x1200 should have been 22", 2560x1600 should have been 24", etc.

            One thing they did get kind of right with 1920x1080 is the 21.5" screen size you could get them in, which was at least above 100 PPI.
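The pixel-density figures in this comment check out: PPI is just the diagonal pixel count over the diagonal size in inches. A quick sketch, using the monitor sizes quoted above:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal resolution over diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

monitors = [
    ('24" 1920x1200', 1920, 1200, 24),
    ('22" 1680x1050', 1680, 1050, 22),
    ('17" 1280x1024', 1280, 1024, 17),
    ('21.5" 1920x1080', 1920, 1080, 21.5),
]
for name, w, h, d in monitors:
    print(f"{name:17s} {ppi(w, h, d):5.1f} PPI")

# The 17" 1280x1024 (~96 PPI) really does beat the 24" 1920x1200 (~94 PPI),
# and the 21.5" 1080p panel is the only one of these above 100 PPI.
```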

        • (Score: 0) by Anonymous Coward on Tuesday January 02 2018, @05:38PM

          by Anonymous Coward on Tuesday January 02 2018, @05:38PM (#616788)

          It also depends on view distance and screen size. The problem with these bigger and higher resolution screens is that many people are watching their TV from across the room, where the increase in detail doesn't matter. There is a reason why TV stores don't let you see anything but their largest at any distance resembling normal. See here for more: http://s3.carltonbale.com/resolution_chart.html [carltonbale.com]
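The distance argument can be made quantitative along the lines of the linked chart: with roughly 1 arcminute of visual acuity (typical 20/20 vision), a pixel stops being resolvable once it subtends less than 1/60 of a degree. A sketch; the 55-inch size is my example, not a figure from the chart:

```python
import math

ARCMIN = math.radians(1 / 60)  # ~1 arcminute: typical 20/20 acuity limit

def max_useful_distance_in(width_px, height_px, diagonal_in):
    """Distance (inches) beyond which adjacent pixels blur together."""
    pixel_pitch = diagonal_in / math.hypot(width_px, height_px)
    return pixel_pitch / math.tan(ARCMIN)

for label, w, h in [("1080p", 1920, 1080), ("4K", 3840, 2160), ("8K", 7680, 4320)]:
    d = max_useful_distance_in(w, h, 55)
    print(f'55" {label}: full detail only within ~{d / 12:.1f} ft')

# Each doubling of resolution halves the distance at which it pays off,
# which is why extra pixels vanish from across the living room.
```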

        • (Score: 3, Insightful) by TheLink on Tuesday January 02 2018, @05:44PM

          by TheLink (332) on Tuesday January 02 2018, @05:44PM (#616795) Journal
          I'm not excited at all about big screens. I'm waiting for _small_ 8k screens per eye (and the hardware and optics for them). That should make things a lot better for AR/VR. Then we can have screens as huge and as many as our hardware and wetware can cope with.

          And hopefully OS builders actually make GUIs and OSes that _help_ with making such stuff easy for humans and augment us rather than do retarded stuff like dumbing down desktop/workstations to tablet style UIs.

          Intel and AMD might get excited about that... There's not as much need for 32 cores and 128GB of RAM if your desktop OS only makes it easy for you to do tablet crap.
        • (Score: 1, Informative) by Anonymous Coward on Tuesday January 02 2018, @08:09PM (1 child)

          by Anonymous Coward on Tuesday January 02 2018, @08:09PM (#616865)

          You barely need antialiasing [with 4K displays]

          Screen resolution has almost nothing to do with anti-aliasing.

          In computer graphics, aliasing normally refers to a type of quantization error caused by rendering to a fixed pattern of pixels (in the case of a computer monitor this is a rectangular grid). It occurs when the source information does not exactly align with the pixel grid. Normally these errors are periodic and highly non-random, resulting in significant damage to the original signal (it is called "aliasing" because different signals become indistinguishable -- hence aliases). The effect is the same sort of error you see in a Moiré pattern [wikipedia.org].

          In this context, anti-aliasing is a form of dither which adds randomness to remove aliasing errors as much as possible, replacing it with a small amount of white noise. Anti-aliasing is always required for correct rendering on a fixed grid of pixels like a computer monitor -- regardless of the resolution of that pixel grid.

          Nevertheless, up to a certain point, higher resolution will look better because the noise floor from dithering will be reduced. The ~100 pixels/inch displays of yesteryear are definitely not ideal for small text so I say bring out more pixels!

          (NB: When done correctly, anti-aliasing will completely eliminate all aliasing errors (with high probability) for signals above the noise floor by the use of random sampling. However, in raster graphics it is very common to use post-processing techniques to hide visible artifacts rather than true anti-aliasing. Such techniques are sometimes confusingly called "anti-aliasing" even though these don't actually do anything to prevent aliasing!)
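The dither argument above can be demonstrated in one dimension: uniformly sampling a 9 Hz sine at 10 Hz folds it down to a spurious 1 Hz alias, while randomly jittered sample times spread that energy into broadband noise instead. A toy sketch, not any particular renderer's AA:

```python
import math
import random

F_SIG, F_ALIAS, F_SAMPLE, N = 9.0, 1.0, 10.0, 2000  # undersampled 9 Hz tone

def alias_energy(times):
    """Correlate samples of the 9 Hz tone against the 1 Hz alias frequency."""
    c = sum(math.sin(2 * math.pi * F_SIG * t) * math.sin(2 * math.pi * F_ALIAS * t)
            for t in times)
    return abs(c) / len(times)

rng = random.Random(0)
uniform = [n / F_SAMPLE for n in range(N)]            # fixed grid
jittered = [(n + rng.random()) / F_SAMPLE for n in range(N)]  # dithered grid

print(f"uniform sampling:  alias correlation = {alias_energy(uniform):.3f}")
print(f"jittered sampling: alias correlation = {alias_energy(jittered):.3f}")

# Uniform sampling shows a strong coherent 1 Hz component (aliasing);
# jitter trades it for low-level noise, as the comment describes.
```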

          • (Score: 0) by Anonymous Coward on Wednesday January 03 2018, @10:39AM

            by Anonymous Coward on Wednesday January 03 2018, @10:39AM (#617119)

            You miss the point completely and are overrated.

            Anti-aliasing is always required for correct rendering on a fixed grid of pixels like a computer monitor -- regardless of the resolution of that pixel grid.

            Not if you are viewing a very high resolution grid with comparatively low resolution eyes.

            Then the limitations of your eyes will do the "anti-aliasing". No need for the grid to do any of that.

    • (Score: 3, Insightful) by LoRdTAW on Tuesday January 02 2018, @05:32PM

      by LoRdTAW (3755) on Tuesday January 02 2018, @05:32PM (#616782) Journal

      One word which justifies high DPI and resolution screens: text.

    • (Score: 3, Funny) by takyon on Tuesday January 02 2018, @05:33PM (1 child)

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Tuesday January 02 2018, @05:33PM (#616784) Journal

      Looking at 4K is like smearing hydrochloric acid in my eyeball. That's why I only use 16K screens at a minimum.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 2) by looorg on Tuesday January 02 2018, @11:12PM

        by looorg (578) on Tuesday January 02 2018, @11:12PM (#616962)

        Please hold still I don't want to get this shit on my fingers ...

  • (Score: 2) by DannyB on Tuesday January 02 2018, @05:43PM (2 children)

    by DannyB (5839) Subscriber Badge on Tuesday January 02 2018, @05:43PM (#616793) Journal

    So now we have the video 8K, 16K, 32K, etc people who are the equivalent of audiophiles who claim that a $1,000 speaker sounds so much worse than their $25,000 speaker. Even though nobody else can hear the difference. And of course, you need a pear.

    Then you get Monster Cable type claims, that gold-plated Ethernet cables for audio sound so much better. Even though a distorted 1 or 0 on a digital signal sounds just as perfect as a well formed 1 or 0 once it hits the DAC.

    Paid for by Americans for Renewable Complaining and Sustainable Whining.

    --
    The lower I set my standards the more accomplishments I have.
    • (Score: 3, Informative) by takyon on Tuesday January 02 2018, @05:58PM (1 child)

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Tuesday January 02 2018, @05:58PM (#616804) Journal

      The use case for 16K involves strapping it to your face. Very close to your face, and distorted by lenses as well. 16K [wikipedia.org] per eye (total of 30720 x 8640) ensures a wide field of view which should cover most of your peripheral vision [wikipedia.org].

      https://wccftech.com/interview-amd-liquid-vr-guennadi-riguer/ [wccftech.com]

      AMD has a road map in place for the evolution of LiquidVR that will grow as the industry grows. Keep in mind however, that this roadmap is far from set in concrete as you will find out later on in this article. AMD is chasing the perfect standard here which Riguer dubbed as “VR Nirvana”: 16k resolution (for each eye) and a refresh rate of 144 Hz. Although, this won’t be possible for at least a few generations of GPUs maybe we will be able to see something close to that by 2020. 16k resolution would be ideal for VR because naturally, the higher the resolution of the eye piece, the less the “screen door” effect. Not only that, but a refresh rate of 144 Hz would eliminate any and all nausea issues associated with low refresh rates. For the time being however, the Oculus standard of 2k 90fps (1080p each eye) will have to do.

      http://www.legitreviews.com/amds-roy-taylor-talks-future-vr-capsaicin-cream-event_191983 [legitreviews.com]

      At the AMD Capsaicin and Cream event, today, [February 28, 2017,] Roy Taylor talked extensively about the future of VR, including delivering VR at 120 FPS with 16K resolution in the coming years.

      https://overclock3d.net/news/gpu_displays/amd_s_raja_koduri_says_that_we_need_16k_at_240hz_for_true_immersion_in_vr/1 [overclock3d.net]

      AMD's Raja Koduri, the head of the Radeon Technology group, has said for a long time that VR is going to be a driving force behind the advancements in GPU performance for many years to come and that he won't be happy until we can run games at 16K at 240Hz within his lifetime, saying that that would be the point where we will achieve "true immersion that you won't be able to tell apart from the real-world".

      Note that I haven't even mentioned 32K, just ultra-wide 16K. Who needs 32K? The virtuophiles!
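For scale, the raw pixel rate behind those numbers: two 16K eyes at the 144 Hz figure AMD quotes works out to roughly a terabit per second uncompressed. A back-of-envelope sketch; the 24 bits/pixel and zero compression are my assumptions:

```python
W, H = 15360, 8640        # one 16K eye (the 30720x8640 total is two of these)
REFRESH_HZ = 144
BITS_PER_PIXEL = 24       # assumed: 8-bit RGB, no compression

pixels_per_frame = 2 * W * H                      # both eyes
pixels_per_sec = pixels_per_frame * REFRESH_HZ
gbps = pixels_per_sec * BITS_PER_PIXEL / 1e9

print(f"{pixels_per_frame / 1e6:.0f} MP per frame")
print(f"{pixels_per_sec / 1e9:.1f} Gpix/s -> ~{gbps:.0f} Gbit/s uncompressed")

# For comparison, HDMI 2.1 (see the related stories) tops out around
# 48 Gbit/s, so a link like this needs heavy compression or something new.
```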

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 1, Informative) by Anonymous Coward on Wednesday January 03 2018, @10:43AM

        by Anonymous Coward on Wednesday January 03 2018, @10:43AM (#617120)

        Not only that, but a refresh rate of 144 Hz would eliminate any and all nausea issues associated with low refresh rates.

        Need low latency too. Many people will still get motion sickness with a refresh rate of 10000Hz if what they see is too delayed from what their ears tell them they should see.

  • (Score: 5, Interesting) by digitalaudiorock on Tuesday January 02 2018, @08:12PM

    by digitalaudiorock (688) on Tuesday January 02 2018, @08:12PM (#616867) Journal

    I have an LG OLED55B6P that I bought last year. The picture quality of OLED really is second to none, primarily because of the true black. Black pixels are just like the TV being shut off. Can't say enough about it. It makes even the best plasma TVs look just OK.

    HOWEVER: Since LG has decided their current corner on the OLED market means they can screw customers, I can't in good conscience recommend anything of theirs. Several months ago, the first major version update to WebOS 3 (from 4.x to 5.30.03) actually broke the main thing you're paying for...the true black [avsforum.com]. It's subtle enough that many have not noticed it, so LG has decided not to even acknowledge the problem, let alone fix it. It's pretty much "sucks to be someone who noticed, doesn't it?". There's plenty about it on the owners thread here [avsforum.com].

    LG offers no way to revert the firmware to a version that actually works, so industrious users have figured out how to set up a dummy web server of their own to trick the TV into downgrading (noted in the above threads). I did that myself yesterday, blocked the LG update sites on my router, and everything's great again. In the process I also discovered that the network issues I'd been having with VUDU had nothing to do with VUDU or their app; they were also caused by this clusterfuck of LG's, and VUDU works perfectly with the old 4.31.20 LG firmware. They simply released a total lemon of an OS upgrade and aren't doing shit about it.

    If this post prevents even one person from buying their shit, I'll be happy. Hit the bastards where it hurts.

  • (Score: 2) by clone141166 on Sunday January 07 2018, @02:11PM

    by clone141166 (59) on Sunday January 07 2018, @02:11PM (#619151)

    ...
