
posted by janrinok on Saturday January 07 2017, @05:01AM
from the looking-good dept.

The key specifications for the HDMI 2.1 standard have been announced:

The HDMI Forum on Wednesday announced key specifications of the HDMI 2.1 standard, which will be published in the second quarter. The new standard will increase link bandwidth to 48 Gbps and will enable support for up to 10K resolutions without compression, new color spaces with up to 16 bits per component, dynamic HDR, variable refresh rates for gaming applications, as well as new audio formats.

The most important feature that the HDMI 2.1 specification brings is massively increased bandwidth over its predecessors. That additional bandwidth (48 Gbps versus 18 Gbps, a bit more than the 40 Gbps a Thunderbolt 3 USB-C cable is rated for) will enable longer-term evolution of displays and TVs, but will require the industry to adopt the new 48G cable, which will keep using the existing connectors (Type A, C and D) and will retain backwards compatibility with existing equipment (which probably means 8b/10b encoding and an effective bandwidth of around 38 Gbps). Standard-length 48G cables (up to two meters) will use copper wires, but it remains to be seen what happens with longer cables. It is noteworthy that while some of the new features the HDMI 2.1 spec brings to the table require the new cable, others do not. As a result, some of the new features might be supported on some devices, whereas others might not be.
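
A quick back-of-the-envelope check of that figure (a minimal sketch in Python; the 8b/10b line coding is the article's own speculation, not a confirmed detail of the spec):

    # Raw link rate versus effective payload rate under the speculated 8b/10b
    # line coding (8 payload bits carried in every 10 transmitted bits).
    raw_gbps = 48.0
    hdmi20_raw_gbps = 18.0

    effective_gbps = raw_gbps * 8 / 10  # 38.4 Gbps -- the "around 38 Gbps" above
    print(f"raw link rate:     {raw_gbps:.1f} Gbps")
    print(f"effective payload: {effective_gbps:.1f} Gbps (assuming 8b/10b)")
    print(f"increase over 2.0: {raw_gbps / hdmi20_raw_gbps:.2f}x raw")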

The increased bandwidth of HDMI 2.1's 48G cables will enable support for new UHD resolutions and frame rates, including 4Kp120, 8Kp100/120, and 10Kp100/120. No less important, the increased bandwidth will enable support for the latest and upcoming color spaces, such as BT.2020 (Rec. 2020) with 10, 12, or even 16 bits per color component, without compression. The HDMI Forum does not say so explicitly, but version 2.1 of the standard will also likely support BT.2100 (Rec. 2100), which adds a number of important HDR-related enhancements over BT.2020. While HDMI 2.0 already supports BT.2020 and HDMI 2.0b adds HDR support via Hybrid Log-Gamma (HLG), it can only transmit 10 and 12 bits per sample at 4Kp60. To support HDR at 8K, one will need HDMI 2.1.
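
To get a feel for why those modes need the extra bandwidth, here is a rough uncompressed-payload estimate (a sketch only: it counts active pixels and three color components and ignores blanking, chroma subsampling and line-coding overhead, so real link requirements differ; the frame sizes are the usual 16:9 assumptions):

    def uncompressed_gbps(width, height, fps, bits_per_component, components=3):
        """Rough payload rate for uncompressed 4:4:4 video, in Gbit/s."""
        return width * height * fps * bits_per_component * components / 1e9

    for name, w, h, fps, bpc in [
        ("1080p60,  8 bpc", 1920, 1080,  60,  8),
        ("4Kp60,   10 bpc", 3840, 2160,  60, 10),
        ("4Kp120,  12 bpc", 3840, 2160, 120, 12),
        ("8Kp60,   10 bpc", 7680, 4320,  60, 10),
    ]:
        print(f"{name}: {uncompressed_gbps(w, h, fps, bpc):5.1f} Gbps")
    # Roughly 3.0, 14.9, 35.8 and 59.7 Gbps: the 4Kp120/12-bit case already
    # overflows HDMI 2.0's 18 Gbps, and the 8K-class modes show why chroma
    # subsampling and compression still matter in practice.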

10K resolution (5760p)? 16-bits per channel color (281,474,976,710,656 shades of grey)? It's necessary!
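
For the record, the two figures being joked about measure different things: 16 bits per component gives 2^16 grey levels on a single channel, while the 281-trillion number is the total count of 48-bit RGB colors across three channels (a quick check):

    bits_per_component = 16
    greys_per_channel = 2 ** bits_per_component    # 65,536 grey levels
    total_colors = 2 ** (bits_per_component * 3)   # 281,474,976,710,656 RGB colors
    print(f"{greys_per_channel:,} grey levels per channel")
    print(f"{total_colors:,} possible 48-bit RGB colors")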


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 2, Insightful) by Deeo Kain on Saturday January 07 2017, @05:26AM

    by Deeo Kain (5848) on Saturday January 07 2017, @05:26AM (#450621)

    The world (virtual) will be more colourful.

  • (Score: 1, Insightful) by Anonymous Coward on Saturday January 07 2017, @08:01AM

    by Anonymous Coward on Saturday January 07 2017, @08:01AM (#450649)

    In my day 16-bit colour meant high-colour [wikipedia.org]: you had 65536 colours tops, probably five bits for each colour channel plus an extra bit which you might be able to give to one of the three channels (usually green). There used to be SVGA modes that used such a colour depth, usually on low-end cards without a lot of video memory, where you’d trade resolution for colour depth. 16 bits per colour channel should rather be called 48-bit colour.

    • (Score: 1, Funny) by Anonymous Coward on Saturday January 07 2017, @02:43PM

      by Anonymous Coward on Saturday January 07 2017, @02:43PM (#450727)

      Back in my day we had only 16 colors, and we liked it.

      • (Score: 0) by Anonymous Coward on Monday January 09 2017, @10:32PM

        by Anonymous Coward on Monday January 09 2017, @10:32PM (#451693)

        We only had 4 colors and we liked it! Back in my father's day they only had 2 colors on a CRT, or if you were very unlucky, or old, on a tele-typewriter.

        Quite frankly though, I just look at all these new changes and wonder if they are for the sake of the user, or for the sake of keeping income by forcing adoption of new technology.

        8-bit to 16-bit color was a dramatic change. 16-bit to 32-bit made a noticeable difference to scene shading. Honestly, HDR hasn't made nearly as much difference (in a positive way, anyway) as the doubling to quadrupling of pixel density over the past 15-ish years. And frankly, if they included 8- or 16-bit color modes, I would certainly consider switching back to them if that doubled or tripled my framerate and let me keep using older video cards, without having to purchase all-new hardware, some of which may no longer support my older operating systems.

    • (Score: 2) by rleigh on Saturday January 07 2017, @03:18PM

      by rleigh (4887) on Saturday January 07 2017, @03:18PM (#450737) Homepage

      The common meaning seems to have shifted from bits-per-pixel to bits-per-sample. I imagine part of the reason is that packed pixel formats are no longer necessary or commonplace, so bits-per-pixel is less important. Certainly in the scientific and medical imaging fields I work in, bits-per-sample is pretty much always what is used. And using bits-per-pixel makes assumptions about the sample count and layout unless they are separately specified; it might not always be RGB. "16-bit" could be 16-bit grey, or 8-bit grey plus alpha, for example. Or a packed format. Giving bits-per-sample plus the sample order is specific and unambiguous.
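
A small sketch of the ambiguity this subthread describes (the 5-6-5 packing is the classic "high colour" layout mentioned above; none of these specific layouts are mandated by HDMI):

    # "16-bit colour" can describe very different pixel layouts.

    # (a) 16 bits per *pixel*: packed RGB 5-6-5, i.e. classic "high colour".
    def pack_rgb565(r, g, b):
        """Pack 8-bit-per-channel RGB into a single 16-bit word (5-6-5)."""
        return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

    print(hex(pack_rgb565(255, 128, 0)))  # 0xfc00 -- 65,536 colours in total

    # (b) 16 bits per *sample*, three samples per pixel (what "16 bits per
    #     component" means in the HDMI 2.1 announcement): 48 bits per pixel.
    bits_per_sample, samples_per_pixel = 16, 3
    print(bits_per_sample * samples_per_pixel, "bits per pixel")  # 48

    # (c) The same "16-bit" label could equally mean 16-bit greyscale (one
    #     sample) or 8-bit grey plus 8-bit alpha (two samples), which is why
    #     stating bits-per-sample plus the sample order is unambiguous.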

  • (Score: 2) by shortscreen on Saturday January 07 2017, @10:03AM

    by shortscreen (2252) on Saturday January 07 2017, @10:03AM (#450674) Journal

    surely that's the most exciting part

  • (Score: 1, Touché) by Anonymous Coward on Saturday January 07 2017, @10:19AM

    by Anonymous Coward on Saturday January 07 2017, @10:19AM (#450679)

    <anonymous pedant>
    Actually 16 bits per channel is 65536 shades of grey.
    </anonymous pedant>

  • (Score: 0) by Anonymous Coward on Saturday January 07 2017, @01:44PM

    by Anonymous Coward on Saturday January 07 2017, @01:44PM (#450717)

    what more could you want from audio visual hardware

  • (Score: 2) by chewbacon on Saturday January 07 2017, @03:57PM

    by chewbacon (1032) on Saturday January 07 2017, @03:57PM (#450750)

    Just bought an already obsolete 4K tv.

    • (Score: 0) by Anonymous Coward on Saturday January 07 2017, @04:55PM

      by Anonymous Coward on Saturday January 07 2017, @04:55PM (#450766)

      Could be worse. I just bought a Sony 4K TV :(

      • (Score: 0) by Anonymous Coward on Sunday January 08 2017, @05:56PM

        by Anonymous Coward on Sunday January 08 2017, @05:56PM (#451103)

        you guys make me sick

        • (Score: 0) by Anonymous Coward on Sunday January 08 2017, @06:18PM

          by Anonymous Coward on Sunday January 08 2017, @06:18PM (#451114)

          I bought a 4K monitor for my PC!

          I don't understand why people think 1920x1080 is better than 1920x1200. I've never had a 1080 monitor, and never will. Yet people buy $800 video cards to run games at a resolution lower than what I was running Windows 98 at (and then complain on forums that the game is poorly optimized because it runs badly anyway).

          That Nvidia called its recent flagship card the "1080" was just another student admission slip for the Eternal September we've entered.