
SoylentNews is people

posted by cmn32480 on Sunday April 17 2016, @02:18AM   Printer-friendly
from the how-many-pixels-are-enough dept.

3D and 4K were nothing! It's all about HDR now!

Netflix has confirmed it has begun its rollout of high dynamic range content on its TV and film streaming service. HDR videos display millions more shades of colour and extra levels of brightness than normal ones, allowing images to look more realistic.

However, to view them members will need a new type of TV or monitor and a premium-priced Netflix subscription. Some HDR content had already been available via Amazon's rival Instant Video service. Ultra-high-definition 4K Blu-ray discs - which launched in the UK earlier this week - also include HDR data.

Netflix's support follows January's creation of a scheme defining the HDR standards a television set must meet to be marketed with an "Ultra HD Premium" sticker. [...] The US firm recommends its members have at least a 25 megabits per second connection to view them.

High-dynamic-range imaging at Wikipedia.

Related:

A Look at AMD's GPU Plans for 2016
LG to Demo an 8K Resolution TV at the Consumer Electronics Show


Original Submission

Related Stories

A Look at AMD's GPU Plans for 2016 16 comments

Following the recent reorganization of AMD's (Advanced Micro Devices) GPU assets under the Radeon Technologies Group (RTG), AMD is talking about its 2016 GPU plans.

2016 Radeon GPUs will support DisplayPort 1.3 & HDMI 2.0a. DisplayPort 1.3 will allow for 5K and even 8K resolution over a single cable, as well as 4K at the higher refresh rates (75+ Hz) needed for AMD's FreeSync Low Framerate Compensation to work. FreeSync will also work over HDMI (which is cheaper and more commonly used than DisplayPort):

Implemented over a customized version of HDMI 1.4a and utilizing a prototype Realtek timing controller, AMD was able to demonstrate variable refresh rate technology running over HDMI. At the time of the presentation AMD was very clear that the purpose of the presentation was to shop around the concept and to influence the various members of the HDMI consortium, but they were also clear that bringing variable refresh rate tech to HDMI was something the company wanted to bring to retail sooner than later. Sooner, as it turns out, was the operative word there. As part of their presentation last week, RTG has announced that FreeSync over HDMI will be heading to retail, and that it will be doing so very soon: Q1'16. This is just a year after the first DisplayPort adaptive sync monitors hit retail, which for a display technology is a rather speedy turnaround from proof of concept to retail product.

The first FreeSync-capable laptop, the Lenovo Y700, was also announced by the RTG; however, it only supports a narrow refresh range of 40 Hz to 60 Hz.

[More after the break.]

LG to Demo an 8K Resolution TV at the Consumer Electronics Show 35 comments

LG will show off a "Super UHD" 98-inch 8K resolution (7680×4320) TV set at the upcoming Consumer Electronics Show (Jan. 6-9). It will also launch three 4K sets with high dynamic range (HDR) capability:

The super-slim UH9500-series TVs have almost invisible bezels and a screen depth of just 6.6mm - that's less than a quarter-inch at its thinnest point. Screen sizes of the 4K models range from 49 to 86 inches. In addition to the three models, LG will also offer a standalone, attention-grabbing Super UHD TV with a huge 98-inch 8K screen.

[...] All sets will also include LG's IPS panel – noted for its advanced off-axis performance – further enhanced by two new LG technologies called True Black Panel and Contrast Maximizer, aimed at improving IPS' typically underwhelming black levels by reducing reflections and maximizing contrast by separating objects from their backgrounds, according to LG. The TVs also include SDR-to-HDR conversion to deliver near-HDR quality from standard sources.

CNET, SlashGear, The Verge.


Original Submission

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 2) by c0lo on Sunday April 17 2016, @02:27AM

    by c0lo (156) on Sunday April 17 2016, @02:27AM (#333059) Journal

    I still have an old CRT monitor capable of 2048x1536.
    I'm still waiting for a flat display capable of the same but, looking at the current trend, I may be too old to care, or dead, by the time they get it to market.

    --
    https://www.youtube.com/watch?v=aoFiw2jMy-0
    • (Score: 0) by Anonymous Coward on Sunday April 17 2016, @02:57AM

      by Anonymous Coward on Sunday April 17 2016, @02:57AM (#333082)

      I still have an old CRT monitor capable of 2048x1536.
      I'm still waiting for a flat display capable of the same but, looking at the current trend, I may be too old to care, or dead, by the time they get it to market.

      Indeed. It is curious to me that they keep ratcheting up 3D, 4K, and now HDR when most of us won't be able to notice any "enhancement" in anything but the bandwidth and download time to get at their content. Who will actually benefit from this, other than the ISPs expecting to rake in lots of money on overage fees?

      • (Score: 2) by c0lo on Sunday April 17 2016, @03:22AM

        by c0lo (156) on Sunday April 17 2016, @03:22AM (#333097) Journal

        Who will actually benefit from this, other than the ISPs expecting to rake in lots of money on overage fees?

        Video equipment manufacturers.
        And garbage disposal/recycling operators - even if the latter may not be a very healthy business.

        --
        https://www.youtube.com/watch?v=aoFiw2jMy-0
        • (Score: 3, Insightful) by frojack on Sunday April 17 2016, @04:13AM

          by frojack (1554) Subscriber Badge on Sunday April 17 2016, @04:13AM (#333121) Journal

          Who will benefit? TV manufacturers struggling to find any means to get you to buy a new TV.

          3D didn't seduce me. 4K is still not enough. This HDR is unlikely to work either.

          I'm running a 9-year-old plasma and I see nothing I need from newer displays.

          And even if I did want a new TV, a built-in microphone and camera is definitely on my interdict list.

          --
          No, you are mistaken. I've always had this sig.
          • (Score: 0) by Anonymous Coward on Sunday April 17 2016, @04:18AM

            by Anonymous Coward on Sunday April 17 2016, @04:18AM (#333123)

            The only benefit the newer TVs would have for me over a plasma television would be the reduced power cost. Although I'm not sure how big a difference that would be, especially as TVs get bigger and require more processing power, and more power for the pixels and whatnot.

            • (Score: 2) by frojack on Sunday April 17 2016, @04:51AM

              by frojack (1554) Subscriber Badge on Sunday April 17 2016, @04:51AM (#333131) Journal

              I haven't shopped for a newer unit either, but with the power we save switching just about every light in the house to LED, we've already saved more than we would have by replacing that plasma.

              This page suggests LED uses a quarter the power that Plasma does:
              http://www.cnet.com/news/what-you-need-to-know-about-tv-power-consumption/ [cnet.com]

              --
              No, you are mistaken. I've always had this sig.
              • (Score: 1) by anubi on Sunday April 17 2016, @09:41AM

                by anubi (2828) on Sunday April 17 2016, @09:41AM (#333196) Journal

                Frojack: I am very impressed with the VISIOs I bought. LED backlights. And they do not get hot in the back.

                There are many brands out there... I really can't tell you if one is any better than the rest... I just stated the one I ended up with. It had all the things I wanted - 1080p, LED backlight, TV, VGA in, and would run from 12 volts DC.

                They also have much higher resolution, letting me use them as computer monitors as well as TVs.

                You might wanna go check some out.

                (The heat is not much of a concern in the winter, but it's really expensive to air-condition it back out in the summer.)

                --
                "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
          • (Score: 2, Interesting) by anubi on Sunday April 17 2016, @09:32AM

            by anubi (2828) on Sunday April 17 2016, @09:32AM (#333191) Journal

            Everything I had a couple of years ago was NEC MultiSync. I ended up donating them to Goodwill after I purchased several VISIO Full HD TVs (1080p) at WalMart and convinced myself that they were reliable enough to use. Not only that, the VISIOs had VGA inputs, which were very important to me.

            I still have not got a warm comfy feeling for HDMI, as I still think of it as a DRM mechanism that may one day arbitrarily begin enforcing someone else's wishes on me. I KNOW the analog VGA is reliable; I do not have nearly that much confidence that an HDMI device will work when I plug it in. For a while there, I had lost confidence that even NTSC could be counted on to work, until I found out how Macrovision worked and saw how to regenerate the sync. So I have not acquired anything that demands HDMI streams. My laptop will deliver HDMI, but before I plunked money down for it, I made sure it would also put out VGA, just in case some rightsholder gets the hots for messing my stuff up after I have paid for it - thinking I could use it.

            I see the new stuff, and it sure looks tempting, but I am also very concerned about giving up my control of it. I would much rather watch a movie in black-and-white NTSC than have the latest ultra-high-definition presentation forcing ads on me every three minutes. I feel the ad-men have told me loud and clear just how they feel about my enjoyment of a presentation every time I watch OTA TV. It's a game with them - just how far can they push me with relentless ads before I simply give up?

            So, while even 4K looks good, I am afraid of HDMI and too ignorant at this time to know how to work around the irritants rightsholders may force onto me.

            When I know how to undo what the handshaking businessmen have done to force me to waste time with unwanted ads, then I may get on board; but for now, I feel I am only being manipulated into spending MY money to put THEIR noose on my neck. I see their setup, and I have other things I would rather spend my dollars on.

            --
            "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
          • (Score: 0) by Anonymous Coward on Monday April 18 2016, @11:05AM

            by Anonymous Coward on Monday April 18 2016, @11:05AM (#333694)

            Who will benefit? TV manufacturers struggling to find any means to get you to buy a new TV.

            And this is exactly the reason.

            The transition from SDTV to HDTV provided a very noticeable improvement in picture quality. That, and the forced shutdown of SDTV broadcasting in favor of HDTV broadcasting, helped generate a huge sales increase for the TV makers for a few years.

            Since then, the majority of TV watchers just do not see a striking enough difference between HDTV and 4K, 3D, or HDR to justify replacing the TV they purchased not too many years ago. So from the TV makers' perspective, they see a depression in sales of new TV sets.

            And because of that depression, they are desperate to find something, anything, that will convince people to buy like they did during the SD-to-HD transition. But no one's falling for the tricks. Hence the quick succession of successor tech: 3D, 4K, smart TV, and now HDR.

      • (Score: 2) by Celestial on Sunday April 17 2016, @05:20AM

        by Celestial (4891) on Sunday April 17 2016, @05:20AM (#333145) Journal

        I agree that the difference with 4K resolution is negligible. However, I do notice a difference with HDR and the 10-bit color gamut. Those are what make the upgrade to new HDR sets and 4K Ultra HD Blu-ray worthwhile, IMO.

      • (Score: 1) by Francis on Monday April 18 2016, @03:30AM

        by Francis (5544) on Monday April 18 2016, @03:30AM (#333560)

        Unlike 4K, and probably unlike 3D, HDR is something that people will notice. The human eye can handle a rather large range of values from very dark to very bright, and often at the same time. Having a TV that can be very dark during night scenes without losing detail is great for movies that are shot at night, especially horror movies.

        Considering how far digital imaging techniques have come over the last 15 years, it only makes sense that TVs and monitors be able to make more use of them than in the past. For films where the typical dynamic range makes sense, you can still do that, but for most films that extra dynamic range is a godsend.

        The same goes for high-frame-rate movies. When you pan around at the standard frame rate, it tends to look jittery, but at double the frame rate the movement is much smoother and more natural. People often don't like it, but that's mainly because they aren't used to it.

    • (Score: 0) by Anonymous Coward on Sunday April 17 2016, @03:45AM

      by Anonymous Coward on Sunday April 17 2016, @03:45AM (#333112)

      Your old CRT also has richer blacks and brighter colors. This is because it's shooting a fucking electron beam at the screen (and somewhat radiating your face, hence the static cling of the screen). The CRT pixels are generating light rather than blocking light as in LCD.

      Currently, only plasma and OLED can be as good as CRT in the contrast department. However, modern displays are insanely expensive when they combine OLED or plasma with resolutions capable of matching or exceeding CRTs.

      • (Score: 2) by Tork on Sunday April 17 2016, @05:11AM

        by Tork (3914) on Sunday April 17 2016, @05:11AM (#333141)
        I've seen OLEDs at Best Buy, if they are better it's not by a huge amount.
        --
        Slashdolt Logic: "25 year old jokes about sharks and lasers are +5, Funny." 💩
        • (Score: 2) by termigator on Monday April 18 2016, @01:24AM

          by termigator (4271) on Monday April 18 2016, @01:24AM (#333508)

          That can be due to bad settings. Most chain electronic stores are horrible at showing the capabilities of high quality displays.

          • (Score: 2) by Tork on Monday April 18 2016, @01:33AM

            by Tork (3914) on Monday April 18 2016, @01:33AM (#333513)
            Perhaps, but even OLED phones and game consoles (Playstation Vita, for example) aren't even close to being night and day. Heck, the Vita's colors shifted depending on viewing angle.
            --
            Slashdolt Logic: "25 year old jokes about sharks and lasers are +5, Funny." 💩
    • (Score: 3, Informative) by Nollij on Sunday April 17 2016, @02:01PM

      by Nollij (4559) on Sunday April 17 2016, @02:01PM (#333257)

      If resolution is your only concern, there are plenty of displays out there.
      Newegg sells over 200 models with comparable or greater specs. [newegg.com]

      Your bigger concerns will be aspect ratio, picture quality, and response time. But I'm guessing some of the $1000+ 4K screens would fit the bill for you.

  • (Score: -1, Troll) by Anonymous Coward on Sunday April 17 2016, @02:28AM

    by Anonymous Coward on Sunday April 17 2016, @02:28AM (#333060)

    That means that talking is very quiet and explosions are very loud right? Making it impossible to watch a movie at an acceptable volume when anyone else is around is not a good thing.

    • (Score: 2) by takyon on Sunday April 17 2016, @02:31AM

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Sunday April 17 2016, @02:31AM (#333064) Journal

      HDR videos display millions more shades of colour and extra levels of brightness than normal ones, allowing images to look more realistic.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 0) by Anonymous Coward on Sunday April 17 2016, @02:50AM

        by Anonymous Coward on Sunday April 17 2016, @02:50AM (#333076)

        Well, it is hard to imagine colors annoying people.

      • (Score: 3, Interesting) by khchung on Sunday April 17 2016, @09:31AM

        by khchung (457) on Sunday April 17 2016, @09:31AM (#333190)

        So it means dark scenes become impossible to see unless you watch in a dark room, while bright outdoor shots become so bright they would hurt your eyes even during the daytime? No thanks.

    • (Score: 0) by Anonymous Coward on Sunday April 17 2016, @02:55AM

      by Anonymous Coward on Sunday April 17 2016, @02:55AM (#333080)

      protip: there's no talking in the 1812 Overture

      also, subtitles

      • (Score: 2) by c0lo on Sunday April 17 2016, @07:22AM

        by c0lo (156) on Sunday April 17 2016, @07:22AM (#333163) Journal

        also, subtitles

        Ah, yes, I almost forgot the subtitles... believe me, they look fantastic in HDR!!!

        (grin)

        --
        https://www.youtube.com/watch?v=aoFiw2jMy-0
    • (Score: 2) by fishybell on Sunday April 17 2016, @04:05AM

      by fishybell (3156) on Sunday April 17 2016, @04:05AM (#333117)

      You could always undo the audio range expansion with compression [bfccomputing.com].

  • (Score: 2) by bitstream on Sunday April 17 2016, @02:44AM

    by bitstream (6144) on Sunday April 17 2016, @02:44AM (#333072) Journal

    Perhaps one should ask which features are worth having: >4K, HDR, 3D, virtual, etc. It all costs plenty of transfer capacity in bit/s. As long as you can't look at the TV from another angle and see "inside" without special glasses, there's something to work on.

    Tip for TV manufacturers:
      * Make an open CEC standard with standard ports.
  * Drop HDMI; it costs small vendors 5000 US$/year + 1 US$/unit. Choose DisplayPort or something sane.
      * Enable computers to send compressed audio/video like h.264 directly over 100BASE-T with an open standard and without first needing to fiddle with the remote control (or proprietary CEC).
      * Skip that "smart-TV" computer. Or at least secure it.
      * Include CEC on that 100BASE-T interface.
      * And secure all the stuff..plz!

  • (Score: 2) by opinionated_science on Sunday April 17 2016, @02:51AM

    by opinionated_science (4031) on Sunday April 17 2016, @02:51AM (#333078)

    We have been using HDR (or the equivalent) for years in the sciences - our microscopes produce images as 30-bit TIFFs, though resolution has improved. GIMP can handle 30-bit images, which is handy for my AFM data!!

    • (Score: 2) by takyon on Sunday April 17 2016, @03:06AM

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Sunday April 17 2016, @03:06AM (#333087) Journal

      I'm not sure color depth is the same thing. For example, 10-bit color (~1.07 billion colors) has been around in consumer hardware and codecs for a while now (for example, HEVC Main 10 profile), but the Wikipedia article mentions HDR10 Media Profile, which combines HDR and 10-bit. If it adds some brightness channel, it would be more than 30 bits per pixel...
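      The ~1.07 billion figure falls straight out of the per-channel arithmetic. A quick sanity check (the function name is mine; it assumes three channels per pixel):

```python
def colour_count(bits_per_channel: int) -> int:
    """Total addressable colours for a 3-channel (RGB) pixel."""
    return (2 ** bits_per_channel) ** 3

print(colour_count(8))   # 16777216   (~16.8 million, classic 24-bit colour)
print(colour_count(10))  # 1073741824 (~1.07 billion, the figure quoted above)
```

      A hypothetical extra 10-bit brightness channel would indeed push a pixel past 30 bits, to 40.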

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 2) by opinionated_science on Sunday April 17 2016, @03:26AM

        by opinionated_science (4031) on Sunday April 17 2016, @03:26AM (#333099)

        Well, "colour" is wavelengths, so yes, it is equivalent. In microscopy, the lamps (mercury etc.) and the filters set the colour, and luminance is measured for each band. With laser confocal you get narrow colour and very deep luminance, as laser spread is so minimal at those distances. Electron microscopy is *phenomenally* precise, as you are counting electrons..... think about that!!! You can get individual elements from atoms under STEM - exceedingly cool.

        With AFM it is a mathematical description of force/current and other derivatives, which are pA-sensitive (10^-12 A). By now it might even be 10^-15 A, as I did this 5 years back, with equipment probably 5 years old!! So we have the TIFF format, which is probably the only standard - every instrument has its own "special" junk format....

        Human eyes (for some biology) have very deep colour perception but pretty lousy resolution, which is probably why this has been late coming to market - linear resolution is just more of the same, but greater depth (bit width) needs much greater increases in computational power.

        To get more information we often process many slices to produce a composite and exploit 3D/parallax. I fully expect the VR headsets to get updated at some point, though TV's are of course more widespread.

        The VR revolution might really help in scientific viz, and I say that as a flicky glasses veteran.....

        Please return to the regularly scheduled discussion :-) /geekasm

  • (Score: 0) by Anonymous Coward on Sunday April 17 2016, @03:21AM

    by Anonymous Coward on Sunday April 17 2016, @03:21AM (#333096)

    I have a new, non-"SmartTV". I would assume this would cut down on data sent back to the mothership. (however the "cable box" is probably a different story!)

    I notice it has USB support, which I found interesting.

    My question is, what type of information is sent back? I know it has to send something back because of the USB support. (I read about this on another forum.)

    • (Score: 2) by takyon on Sunday April 17 2016, @03:28AM

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Sunday April 17 2016, @03:28AM (#333101) Journal

      Don't let it connect to the Internet, bro.

      Last time I set up a smart TV, getting it connected to Wi-Fi was a pain in the ass. But it didn't need a net connection to play files on connected USB sticks or external hard drives.

      If it's not connected to the Internet, it's not going to send anything back.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 1) by tftp on Sunday April 17 2016, @07:27AM

        by tftp (806) on Sunday April 17 2016, @07:27AM (#333166) Homepage
        How do you know that the TV is not connected to the Internet if it has Wi-Fi? Unencrypted, unprotected networks are often available.
        • (Score: 2) by Bot on Sunday April 17 2016, @09:45AM

          by Bot (3902) on Sunday April 17 2016, @09:45AM (#333198) Journal

          In fact, on my Wi-Fi access point list I see a TV.

          --
          Account abandoned.
        • (Score: 2) by maxwell demon on Sunday April 17 2016, @09:50AM

          by maxwell demon (1608) Subscriber Badge on Sunday April 17 2016, @09:50AM (#333200) Journal

          Listen on those unprotected networks with the TV on, and then with the TV off and disconnected from power. Any communication from your TV should be identifiable that way.

          --
          The Tao of math: The numbers you can count are not the real numbers.
  • (Score: 2, Informative) by Anonymous Coward on Sunday April 17 2016, @03:38AM

    by Anonymous Coward on Sunday April 17 2016, @03:38AM (#333110)

    As a gamedev, I need to tell you about HDR. HDR isn't what TFA implies it is.

    HDR is a method of encoding a highly dynamic range of light values. Think of a picture. Instead of having RGB from 0-255, we have a floating-point value for each R, G, B channel. Now, let's say 0.0 is no saturation (0 in the 0-255 range) and 1.0 is full saturation (255 in the 0-255 range). Now, imagine that instead of capping the values at 1.0, we instead let light sources contribute as much light as they want - even values trillions above 1.0 are valid lighting values.

    In standard rendering, the 0-255 range is a fixed mapping to the floating-point 0-to-1 scale. However, in HDR the range is mapped through a dynamic function. So in very low-light settings you could have 0 being the min (0) and 0.25 being the max (255). We create a function that takes the average saturation of the screen and applies a range around it. Then over time we adjust the scale of the ranges to emulate what happens when your eyes adjust to the dark: you enter a dark corridor and everything appears black, but we then make the lower light levels map to 255, and you can see in the dark after the adjustment period. The same goes for walking out into brightness. There the lighting could average 50.0, so we might scale everything below 25 to an output of 0 and everything above 100 to an output of 255. When you first walk outside, everything is too bright to distinguish from white, but then the range is dynamically adjusted to this higher level, and you can see.
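    The eye-adaptation remapping above can be sketched in a few lines. This is a toy illustration only (a Reinhard-style operator; the names and constants are mine, not any engine's actual code):

```python
def tone_map(hdr_values, adaptation):
    """Map unbounded HDR luminance values into the displayable 0-255 range,
    centred on the viewer's current adaptation level."""
    out = []
    for v in hdr_values:
        scaled = v / adaptation               # expose for the current light level
        compressed = scaled / (1.0 + scaled)  # squash [0, inf) into [0, 1)
        out.append(round(compressed * 255))
    return out

dark_scene = [0.02, 0.1, 0.8]
print(tone_map(dark_scene, 0.1))   # dark-adapted: faint details spread across the range
print(tone_map(dark_scene, 50.0))  # sun-adapted: the same values crush toward black
```

    Animating `adaptation` toward the scene's average luminance over a few seconds gives the corridor/daylight adjustment effect described above.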

    We can even set the visible intensity spectrum to a minimum / maximum range, and then enable "thermal vision" mode by setting the HDR to ramp up the non-visible range. Then we just have heat sources contribute to the lighting equation. To indicate the thermal vision mode we may force the output to gray-scale and then tint it green.

    However, it is important to note that all of these calculations are performed BEFORE the data ever hits your screen. "HDR" is being used as a buzzword now to push new TVs with greater contrast ratios, but the reality is that your video player (or codec software) could perform the same HDR calculation I just described, mapping the HDR output down into a visible range. Furthermore, the HDR processing could be done in production of the video rather than in realtime (since videos are not interactive games).

    The TV itself does not have a "highly dynamic range". No. It has a fixed range of contrast, brightness and pixel-change latency. It's not like the pixels magically become capable of displaying brighter colors on demand; their range is fixed. There is no such thing as a TV with High Dynamic Range. HDR is a video processing / rendering technique that allows you to record a large range of lighting values and then set your "f-stop" / exposure on the fly later. It has nothing to do with hardware, and everything to do with software. HDR is a "muh graphics!" gaming buzzword (hint: so are "3D" and "VR"), but unlike "High Resolution" (or "High Definition", HD), HDR has nothing to do with the physical properties of the screen.

    An HDR set may be a nicer screen, but there's no reason you can't view HDR content on a shitty old CRT - the whole fucking point of HDR was to let displays that aren't capable of being as bright as the sun EMULATE that range by dynamically adjusting the visible color range.

    TL;DR: Don't buy into the HDR hype. It's a bogus marketing buzzword applied incorrectly, and is as meaningless as "hypoallergenic" or "synergistic".

    • (Score: 1, Funny) by Anonymous Coward on Sunday April 17 2016, @03:51AM

      by Anonymous Coward on Sunday April 17 2016, @03:51AM (#333115)

      Amendment: HDR as used in TV marketing just refers to more than 8 bits of sensitivity per color channel. This means the STATIC (not dynamic) range of values displayed can be something other than 0-255 per channel. However, in moving content the human eye cannot differentiate between individual values delineated in the 256-levels-per-component range (which is one reason why it has stuck around so long).

      • (Score: 0) by Anonymous Coward on Sunday April 17 2016, @05:20AM

        by Anonymous Coward on Sunday April 17 2016, @05:20AM (#333144)

        No. You've got a lot wrong.

        HDR in televisions is more than just increased color precision. It is also an increased color gamut, i.e. more colors beyond what non-HDR TVs can display. Related, and in addition to that, it also refers to increased contrast through increased brightness. For example, a typical TV might have a maximum brightness of 100 nits, while a top-of-the-line TV with Dolby Vision would have a peak brightness of 4,000 nits.

        • (Score: 0) by Anonymous Coward on Sunday April 17 2016, @05:36AM

          by Anonymous Coward on Sunday April 17 2016, @05:36AM (#333147)

          increased color precision. It is also increased color gamut

          No, you're being a retard. Precision = more bits. Gamut = more bits. It is the same.

          • (Score: 3, Informative) by maxwell demon on Sunday April 17 2016, @10:13AM

            by maxwell demon (1608) Subscriber Badge on Sunday April 17 2016, @10:13AM (#333210) Journal

            No, you're being a retard. Precision = more bits. Gamut = more bits. It is the same.

            You are the retard. You can have a greater gamut [wikipedia.org] with the same number of pixels, and the same gamut with a higher number of pixels. The gamut describes the range of colours you can display, while the precision describes how small a variation of colour you can display.

            The gamut is related to the physical properties of the display. You'd get the highest gamut if you used three suitable monochromatic light sources for your pixels. Of course you can emulate a display of a strictly lower gamut with a display of a strictly higher gamut. Note, however, that gamuts naturally have only a partial order; in principle you could have a low-gamut display that can display some colours which a high-gamut display cannot.

            Also, you probably still have the illusion that your RGB monitor can show all existing colours. It can't. Contrary to what you often read, you cannot generate all colours from just three base colours. In particular, no spectral colour can be mixed from other colours.

            The reason RGB monitors work anyway is that they can give you a sufficiently large range of colours that you can live with it. The remaining colours are mapped into the closest supported colour.

            --
            The Tao of math: The numbers you can count are not the real numbers.
            • (Score: 2) by maxwell demon on Sunday April 17 2016, @10:59AM

              by maxwell demon (1608) Subscriber Badge on Sunday April 17 2016, @10:59AM (#333219) Journal

              You can have a greater gamut with the same number of pixels, and the same gamut with a higher number of pixels.

              Here I of course meant "bits" instead of "pixels".

              --
              The Tao of math: The numbers you can count are not the real numbers.
      • (Score: 0) by Anonymous Coward on Sunday April 17 2016, @10:04AM

        by Anonymous Coward on Sunday April 17 2016, @10:04AM (#333207)

        Ummm... Increase the Dynamic Range of your TV Today! Just send $15.95 plus $7.95 shipping and handling and we will show you how to adjust a magic parameter called "Contrast"!

        Gotta be true! It was said in front of a TV camera and through a Microphone!

  • (Score: 2, Interesting) by stretch611 on Sunday April 17 2016, @06:27AM

    by stretch611 (6199) on Sunday April 17 2016, @06:27AM (#333158)

    While 25-megabit connections are not difficult to find in many urban areas of the US, there is still a problem. We still have monopolies that will give you high speed and then cap you at an abysmal 300 GB/month. At 25 megabits/s, assuming it is fully used, your entire monthly bandwidth cap will be gone in a little over a single day (25 megabits/s is approximately 270 GB/day). So even if you have the speed, you can only make limited use of the service.
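    The arithmetic checks out; a quick sketch (using 1 GB = 10^9 bytes):

```python
# Verify the 270 GB/day figure for a fully utilised 25 Mbit/s link.
link_bps = 25 * 1_000_000          # 25 megabits per second
seconds_per_day = 24 * 60 * 60

gb_per_day = link_bps / 8 * seconds_per_day / 1e9
print(f"{gb_per_day:.0f} GB/day")                     # 270 GB/day

cap_gb = 300                        # the monthly cap mentioned above
print(f"cap gone in {cap_gb / gb_per_day:.2f} days")  # 1.11 days
```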

    --
    Now with 5 covid vaccine shots/boosters altering my DNA :P
    • (Score: 2) by Celestial on Sunday April 17 2016, @07:10PM

      by Celestial (4891) on Sunday April 17 2016, @07:10PM (#333354) Journal

      Yeah, that's the big issue (at least in the urban U.S. areas), and why I will still use physical discs.

  • (Score: 1) by pTamok on Sunday April 17 2016, @11:05AM

    by pTamok (3042) on Sunday April 17 2016, @11:05AM (#333220)

    What would be good is an increase in frame rate, especially for video imagery of fast-moving things.

    https://en.wikipedia.org/wiki/Frame_rate [wikipedia.org]

    Most video currently has a frame rate of either 25 or 30 (near enough*) frames per second, but because most video is interlaced, it has a field rate of twice that. Read the Wikipedia article for more background.

    Increasing the frame rate is a way in which perceived picture quality can be improved materially for viewers - more so than increasing resolution. Widening the colour gamut is another way. https://en.wikipedia.org/wiki/Gamut [wikipedia.org]

    *It is actually 29.97, and the reason for that is an interesting technical story:-
        http://glydeck.blogspot.no/2011/07/why-do-we-have-2997-frame-rates-and-not.html [blogspot.no] ; and
    for audio 'complications' :-
        https://www.gearslutz.com/board/post-production-forum/184636-definitive-explanation-29-97-23-98-timecode.html [gearslutz.com]
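    For reference, the "near enough" rate is exactly 30000/1001: NTSC colour lowered the original 30 frames/s by a factor of 1000/1001 to keep the new colour subcarrier from interfering with the audio carrier (the linked articles tell the full story).

```python
from fractions import Fraction

ntsc = Fraction(30, 1) * Fraction(1000, 1001)  # the 1000/1001 slowdown
print(ntsc)         # 30000/1001
print(float(ntsc))  # approximately 29.97003
```

    The same factor is why 24 fps film transferred for NTSC runs at 23.976 fps.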

    • (Score: 1) by Type44Q on Sunday April 17 2016, @08:52PM

      by Type44Q (4347) on Sunday April 17 2016, @08:52PM (#333395)

      but because most video is interlaced

      No, it isn't.

  • (Score: 0) by Anonymous Coward on Monday April 18 2016, @06:58AM

    by Anonymous Coward on Monday April 18 2016, @06:58AM (#333635)

    This explains why my Netflix is no longer working. Tech support is a dream: suggesting firmware issues with the television? What the Frell? So maybe it's time to just use YouTube. It works on my TV. Or The Pirate Bay; it works on my BitTorrent client! Do these corps know they are screwing themselves? Each of these attempts to restrict access inexorably pushes all of us to the more convenient sources that they have intentionally made "illegal", but more and more we read "illegal" as "not fucked up, and actually works". Got to love technology. No love for Netflix.