HDMI 2.1 Announced

posted by janrinok on Saturday January 07 2017, @05:01AM
from the looking-good dept.

The key specifications for the HDMI 2.1 standard have been announced:

The HDMI Forum on Wednesday announced key specifications of the HDMI 2.1 standard, which will be published in the second quarter. The new standard will increase link bandwidth to 48 Gbps and will enable support for up to 10K resolutions without compression, new color spaces with up to 16 bits per component, dynamic HDR, variable refresh rates for gaming applications as well as new audio formats.

The most important feature that the HDMI 2.1 specification brings is massively increased bandwidth over its predecessors. That additional bandwidth (48 Gbps over 18 Gbps, a bit more than what a USB-C cable is rated for) will enable longer-term evolution of displays and TVs, but will require the industry to adopt the new 48G cable, which will keep using the existing connectors (Type A, C and D) and will retain backwards compatibility with existing equipment (which probably means 8b/10b encoding and an effective bandwidth of around 38 Gbps). The standard-length 48G cables (up to two meters) will use copper wires, but it remains to be seen what happens with longer cables. It is noteworthy that while some of the new features that the HDMI 2.1 spec brings to the table require the new cable, others do not. As a result, some of the new features might be supported on some devices, whereas others might not be.
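
As a sanity check on those figures, here is a minimal sketch of the line-coding arithmetic being speculated about above. The encodings are illustrative only: the 8b/10b guess is the submitter's, and the 16b/18b line happens to match the ~42.67 Gb/s figure quoted in the follow-up story below.

```python
# Effective (payload) bandwidth after line-coding overhead.
def effective_gbps(raw_gbps, data_bits, line_bits):
    """Payload rate when every line_bits on the wire carry data_bits of data."""
    return raw_gbps * data_bits / line_bits

print(effective_gbps(18, 8, 10))    # HDMI 2.0 TMDS (8b/10b)  -> 14.4 Gbps
print(effective_gbps(48, 8, 10))    # 48G with 8b/10b         -> 38.4 Gbps ("around 38 Gbps")
print(effective_gbps(48, 16, 18))   # 48G with 16b/18b        -> ~42.7 Gbps
```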

The increased bandwidth of HDMI 2.1's 48G cables will enable support for new UHD resolutions, including 4Kp120, 8Kp100/120, and 10Kp100/120, with increased frame rates. No less important, the increased bandwidth will enable support for the latest and upcoming color spaces, such as BT.2020 (Rec. 2020) with 10, 12, or even 16 bits per color component, without compression. The HDMI Forum does not say it explicitly, but version 2.1 of the standard will also likely support BT.2100 (Rec. 2100), which has a number of important enhancements over BT.2020 when it comes to HDR. While HDMI 2.0 already supports BT.2020 and HDMI 2.0b adds HDR support via Hybrid Log-Gamma (HLG), it can only transmit 10 and 12 bits per sample at 4Kp60 resolution. To support HDR at 8K, one will need HDMI 2.1.

10K resolution (5760p)? 16-bits per channel color (281,474,976,710,656 shades of grey)? It's necessary!
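
For the record, the arithmetic behind those numbers:

```python
# 16 bits per channel gives 2**16 levels per channel; across three channels
# that is 2**48 distinct values, which is where the huge figure above comes from.
print(2 ** 16)   # 65,536 levels per channel
print(2 ** 48)   # 281,474,976,710,656 combinations over R, G and B
```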


Original Submission

Related Stories

HDMI 2.1 Released 15 comments

The High-Definition Multimedia Interface 2.1 specification has been released. The total transmission bandwidth has been increased to 48 Gb/s from the 18 Gb/s of HDMI 2.0 (or a maximum data rate of about 42.67 Gb/s, up from 14.4 Gb/s). The new data rate is effectively tripled to 128 Gb/s when using Display Stream Compression (DSC).

Using DSC, HDMI 2.1 cables can transmit 4K (3840×2160) @ 240 Hz, and 8K (7680×4320) as well as UW10K (10240×4320) at 120 Hz. Without DSC, you will be able to transmit 4K @ 120 Hz, 5K (5120×2880) @ 120 Hz, 8K @ 60 Hz, and UW10K @ 30 Hz. Keep in mind that color depth and chroma subsampling also affect the necessary data rate.
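
As a rough illustration of that caveat, here is a small sketch that checks a few of the listed modes against the ~42.67 Gb/s uncompressed payload rate and the roughly tripled DSC ceiling. Blanking intervals are ignored, so real timings need somewhat more headroom than these raw pixel rates suggest.

```python
# Raw pixel-payload rate (Gb/s) for a video mode, ignoring blanking overhead.
def payload_gbps(width, height, hz, bits_per_pixel):
    return width * height * hz * bits_per_pixel / 1e9

UNCOMPRESSED_LIMIT = 42.67              # HDMI 2.1 maximum data rate, Gb/s
DSC_LIMIT = 3 * UNCOMPRESSED_LIMIT      # ~128 Gb/s effective with DSC

print(payload_gbps(3840, 2160, 120, 24))   # 4K@120, 8-bit RGB    -> ~23.9, fits uncompressed
print(payload_gbps(7680, 4320, 60, 24))    # 8K@60, 8-bit RGB     -> ~47.8, over the limit
print(payload_gbps(7680, 4320, 60, 18))    # 8K@60, 12-bit 4:2:0  -> ~35.8, fits uncompressed
print(payload_gbps(10240, 4320, 120, 24))  # UW10K@120, 8-bit RGB -> ~127, needs DSC
```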

The specification also adds new features such as dynamic high-dynamic-range support (you read that right - the first "dynamic" refers to "dynamic metadata that allows for changes on a scene-by-scene or frame-by-frame basis"), Variable Refresh Rate, Quick Frame Transport, Quick Media Switching, and Auto Low-Latency Mode:

This new version of the HDMI specification also introduces an enhanced refresh rate that gamers will appreciate. VRR, or Variable Refresh Rate, reduces, or in some cases eliminates, lag for smoother gameplay, while Quick Frame Transport (QFT) reduces latency. Quick Media Switching, or QMS, reduces the amount of blank-screen wait time while switching media. HDMI 2.1 also includes Auto Low Latency Mode (ALLM), which automatically sets the ideal latency for the smoothest viewing experience.

Also at the HDMI Forum, AnandTech, Tom's Hardware, and The Verge.

Previously: HDMI 2.1 Announced


Original Submission

DisplayPort 2.0 Announced, Triples Bandwidth to ~77.4 Gbps for 8K Displays 11 comments

VESA Announces DisplayPort 2.0 Standard: Bandwidth For 8K Monitors & Beyond

While display interface standards are slow to move, at the same time their movement is inexorable: monitor resolutions continue to increase, as do refresh rates and color depths, requiring more and more bandwidth to carry signals for the next generation of monitors. Keeping pace with the demand for bandwidth, the DisplayPort standard, the cornerstone of PC display standards, has now been through several revisions since it was first launched over a decade ago. And now this morning the standard is taking its biggest leap yet with the release of the DisplayPort 2.0 specification. Set to offer nearly triple the available bandwidth of DisplayPort 1.4, the new revision of DisplayPort is also moving a number of previously optional features into the core standard, creating what's in many ways a new baseline for the interface.

The big news here, of course, is raw bandwidth. The current versions of DisplayPort – 1.3 & 1.4 – offer up to 32.4 Gbps of bandwidth – or 25.9 Gbps after overhead – which is enough for a standard 16.7 million color (24-bit) monitor at up to 120Hz, or up to 98Hz for 1 billion+ (30-bit) monitors. This is a lot of bandwidth, but it still isn't enough for the coming generation of monitors, including the likes of Apple's new 6K Pro Display XDR monitor, and of course, 8K monitors. As a result, the need for more display interface bandwidth continues to grow, with these next-generation monitors set to be the tipping point. And all of this is something that the rival HDMI Forum has already prepared for with their own HDMI 2.1 standard.

DisplayPort 2.0, in turn, is shooting for 8K and above. The new standard introduces not just one but several bitrate modes; the fastest mode in DisplayPort 2.0 will top out at 80 Gbps of raw bandwidth, about 2.5 times that of DisplayPort 1.3/1.4. Layered on that, DisplayPort 2.0 also introduces a more efficient coding scheme, resulting in much less coding overhead. As a result, the effective bandwidth of the new standard will peak at 77.4 Gbps, which at 2.98x the bandwidth of the previous standard is just a hair under a full trebling of available bandwidth.
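
For comparison, a quick sketch of the overhead arithmetic for both generations. The four-lane layout, the 8.1 and 20 Gbps per-lane rates, and the 8b/10b and 128b/132b encodings are the commonly published figures rather than anything stated in the excerpt; the excerpt's 120 Hz example does not name a resolution, so 4K is assumed for the last line, and the simple coding ratio for DisplayPort 2.0 lands a hair above the quoted 77.4 Gbps, presumably due to additional link overhead.

```python
# Raw vs. effective DisplayPort bandwidth over a four-lane link.
def effective_gbps(per_lane_gbps, lanes, data_bits, line_bits):
    return per_lane_gbps * lanes * data_bits / line_bits

print(effective_gbps(8.1, 4, 8, 10))      # DP 1.3/1.4 (HBR3, 8b/10b)    -> 25.92 Gbps
print(effective_gbps(20.0, 4, 128, 132))  # DP 2.0 top mode (128b/132b)  -> ~77.6 Gbps

# A 24-bit 4K panel at 120 Hz needs roughly this much payload (blanking ignored):
print(3840 * 2160 * 120 * 24 / 1e9)       # -> ~23.9 Gbps, within the DP 1.3/1.4 budget
```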

Related: HDMI 2.1 Announced
Linux, Meet DisplayPort
HDMI 2.1 Released
VirtualLink Consortium Announces USB Type-C Specification for VR Headsets


Original Submission

  • (Score: 2, Insightful) by Deeo Kain (5848) on Saturday January 07 2017, @05:26AM (#450621)

    The world (virtual) will be more colourful.

  • (Score: 1, Insightful) by Anonymous Coward on Saturday January 07 2017, @08:01AM (#450649)

    In my day 16-bit colour meant high-colour [wikipedia.org]: you had 65536 colours tops, probably five bits for each colour channel plus an extra bit which you might be able to give to one of the three channels (usually green). There used to be SVGA modes that used such a colour depth, usually on low-end cards without a lot of video memory, where you’d trade resolution for colour depth. 16 bits per colour channel should rather be called 48-bit colour.
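
    [Ed. note: for anyone who never met it, a small sketch of the 5:6:5 layout described above, with the spare bit going to green.]

    ```python
    # Pack 8-bit-per-channel RGB into a 16-bit "high colour" (RGB565) value:
    # 5 bits red, 6 bits green (green gets the extra bit), 5 bits blue.
    def pack_rgb565(r, g, b):
        return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

    print(hex(pack_rgb565(255, 255, 255)))  # 0xffff -> white
    print(2 ** 16)                          # 65536 representable values in total
    ```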

    • (Score: 1, Funny) by Anonymous Coward on Saturday January 07 2017, @02:43PM (#450727)

      Back in my day we had only 16 colors, and we liked it.

      • (Score: 0) by Anonymous Coward on Monday January 09 2017, @10:32PM (#451693)

        We only had 4 colors and we liked it! Back in my father's day they only had 2 colors on a CRT, or if you were very unlucky, or old, on a tele-typewriter.

        Quite frankly though, I just look at all these new changes and wonder if they are for the sake of the user, or for the sake of keeping income by forcing adoption of new technology.

        8-bit to 16-bit color was a dramatic change. 16-bit to 32-bit made a noticeable difference to scene shading. Honestly, HDR hasn't made nearly as much difference (in a positive manner anyways) as the double to quadruple increase in pixel density in the past 15-ish years. And frankly, if they included 8- or 16-bit color modes, I would certainly consider switching back to them if they doubled or tripled my framerate while allowing me to continue using older video cards without needing to purchase all new hardware, some of which may no longer support my older operating systems.

    • (Score: 2) by rleigh (4887) on Saturday January 07 2017, @03:18PM (#450737)

      The common meaning seems to have shifted from bits-per-pixel to bits-per-sample. I imagine part of the reason is that packed pixel formats are no longer necessary or commonplace, so bits-per-pixel is less important. Certainly for the scientific and medical imaging fields I work in, bits-per-sample is pretty much always what is used. Also, when using bits-per-pixel it's also making assumptions about the sample count and layout unless separately specified; it might not always be RGB. "16-bit" could be 16-bit grey, or 8-bit grey+alpha, for example. Or a packed format. Using the bits-per-sample + sample order is specific and unambiguous.
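
      [Ed. note: to make the point concrete, the same two bytes read under different assumptions give entirely different pixels, which is why bits-per-sample plus the sample layout is the unambiguous description.]

      ```python
      import struct

      raw = b"\x1f\xf8"                        # two bytes of pixel data, layout unknown
      as_grey16 = struct.unpack("<H", raw)[0]  # one 16-bit grey sample    -> 63519
      as_ga88   = struct.unpack("2B", raw)     # 8-bit grey + 8-bit alpha  -> (31, 248)
      # ...or the same 16 bits could be a packed RGB565 pixel, as in the sibling thread.
      print(as_grey16, as_ga88)
      ```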

  • (Score: 2) by shortscreen (2252) on Saturday January 07 2017, @10:03AM (#450674)

    surely that's the most exciting part

  • (Score: 1, Touché) by Anonymous Coward on Saturday January 07 2017, @10:19AM (#450679)

    <anonymous pedant>
    Actually 16 bits per channel is 65534 shades of grey.
    </anonymous pedant>

  • (Score: 0) by Anonymous Coward on Saturday January 07 2017, @01:44PM (#450717)

    what more could you want from audio visual hardware

  • (Score: 2) by chewbacon (1032) on Saturday January 07 2017, @03:57PM (#450750)

    Just bought an already obsolete 4K tv.

    • (Score: 0) by Anonymous Coward on Saturday January 07 2017, @04:55PM (#450766)

      Could be worse. I just bought a Sony 4K TV :(

      • (Score: 0) by Anonymous Coward on Sunday January 08 2017, @05:56PM (#451103)

        you guys make me sick

        • (Score: 0) by Anonymous Coward on Sunday January 08 2017, @06:18PM (#451114)

          I bought a 4K monitor for my PC!

          I don't understand why people thought 1920x1080 was better than 1920x1200. I've never had a 1080 monitor, and never will. Yet people buy $800 video cards to run games at a resolution lower than what I was running Windows 98 on (and then complain on forums that the game is poorly optimized because it runs badly anyway).

          That Nvidia called their recent flagship card the "1080" was just another student admission slip for the Eternal September we've entered.