DisplayPort Alt Mode 2.0 Spec Released: Defining Alt Mode for USB4
As the tech industry gears up for the launch of the new USB4 standard, a few more parts first need to fall into place. Along with the core specification itself, there is the matter of alternate modes, which add further functionality to USB Type-C host ports by allowing the data pins to be used to carry other types of signals. Keeping pace with the updates to USB4, some of the alt modes are being updated as well, and this process is starting with the granddaddy of them all: DisplayPort Alt Mode.
The very first USB-C alt mode, DisplayPort Alt Mode, was introduced in 2014. By remapping the USB-C high-speed data pins from USB data to DisplayPort data, it became possible to use a USB-C port as a DisplayPort video output, and in some cases even to mix the two, carrying both USB 3.x signaling and DisplayPort signaling over the same cable. Since DisplayPort Alt Mode's release, the number of devices with video output has exploded, and in laptops especially it has become the preferred way to drive video outputs when the machine doesn't include a dedicated HDMI port.
If you're willing to accept Display Stream Compression... New DisplayPort spec enables 16K video over USB-C
Previously: Forget USB 3.2: Thunderbolt 3 Will Become the Basis of USB 4
DisplayPort 2.0 Announced, Triples Bandwidth to ~77.4 Gbps for 8K Displays
Speed-Doubling USB4 is Ready -- Now We Just Have to Wait for Devices
(Score: 2) by Immerman on Saturday May 02 2020, @03:44AM
You're describing "HDR effects": one way to generate HDR data from non-HDR originals, usually applied intentionally for a fanciful look.
At its base, though, HDR is really about how two factors compare: the maximum difference between the darkest and lightest parts of the scene, and the smallest difference in brightness that can be displayed in the dark part of the scene.
I'm not 100% sure that HDR video formats follow this, but as I recall HDR image formats are essentially floating-point rather than fixed-point, so that you can simultaneously have a wide range of brightness and extremely small steps in brightness on dark objects.
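To make the floating-point behavior concrete: OpenEXR-style HDR images store channels as 16-bit "half" floats, where the gap between adjacent representable values grows with magnitude. A quick NumPy sketch (purely illustrative, in arbitrary linear-light units) shows the effect:

    import numpy as np

    # Spacing between adjacent representable float16 values at
    # different brightness levels (arbitrary linear-light units).
    for lum in (0.01, 0.1, 1.0, 10.0, 100.0, 1000.0):
        step = float(np.spacing(np.float16(lum)))
        print(f"brightness {lum:8.2f} -> smallest step {step:.3g}"
              f" ({step / lum:.1e} relative)")

The absolute step grows with brightness while the relative step stays near 0.1%: fine gradations in the shadows and coarse ones in the highlights, exactly the trade-off described above.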
The classic example is looking out from a dark cave into a brightly sunlit field. The field is bright enough to hurt your eyes if you stare straight at it, yet you can still make out the surface details of the cave wall when you look directly at them. Without HDR you have a few options when capturing such an image:
1) Use a lot more bits per pixel than normal, so you can capture the full range of brightness without losing detail in the darkness.
2) Try to do that without increasing the bpp, and get a lot of "banding" as a result, where gradual changes in brightness fall below the smallest representable brightness step - especially visible in the dark areas, where detail is almost totally lost (see the sketch just after this list).
3) Mute the brightness so that you can see the cave wall clearly, but the bright areas are either badly washed out or far dimmer than they should be, depending on your strategy.
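To see what option (2) looks like in practice, here's a small illustrative sketch (not tied to any real format) that linearly quantizes a smooth ramp spanning a 1000:1 brightness range down to 8 bits:

    import numpy as np

    # A smooth ramp covering a 1000:1 brightness range in linear light.
    ramp = np.linspace(0.001, 1.0, 100_000)

    # Option (2): keep a normal bit depth, quantizing linearly to 8 bits.
    codes = np.round(ramp * 255).astype(np.uint8)

    # Distinct code values left for the darkest 1% of the range:
    print(np.unique(codes[ramp <= 0.01]))   # prints [0 1 2 3]

Only four code values cover the entire darkest part of the scene, so any smooth gradient there collapses into a few visible bands.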
HDR formats are generally a compromise between (1) and (2): they use a few more bits per pixel, but also a non-linear, "floating point"-style representation of values, so that the size of the smallest possible brightness step grows as the point gets brighter. Essentially, you get much finer discrimination of brightness in the darker areas of the image, where that fine detail matters, while still being able to capture very much brighter areas in the same image, without dramatically increasing the amount of data needed per pixel.
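That's exactly how the transfer function used by common HDR video formats behaves: HDR10 and friends use the SMPTE ST 2084 "PQ" curve, where one 10-bit code step covers a tiny luminance range in the shadows and a much larger one in the highlights. A sketch using the published PQ constants (the chosen code levels are just for illustration):

    import numpy as np

    # SMPTE ST 2084 (PQ) EOTF constants, as published in the spec.
    M1 = 2610 / 16384        # ~0.1593
    M2 = 2523 / 4096 * 128   # ~78.84
    C1 = 3424 / 4096         # ~0.8359
    C2 = 2413 / 4096 * 32    # ~18.85
    C3 = 2392 / 4096 * 32    # ~18.69

    def pq_to_nits(code):
        """Map a normalized PQ code value in [0, 1] to luminance in nits."""
        e = np.power(code, 1 / M2)
        return 10000 * np.power(np.maximum(e - C1, 0) / (C2 - C3 * e), 1 / M1)

    # Luminance covered by one 10-bit code step at points along the curve:
    for level in (64, 128, 256, 512, 768, 1000):
        lo = pq_to_nits(level / 1023)
        hi = pq_to_nits((level + 1) / 1023)
        print(f"code {level:4d}: {lo:9.3f} nits, one step = +{hi - lo:.5f} nits")

Near black a single code step is a tiny fraction of a nit; near the top of the range it spans tens of nits, which the eye tolerates because its own sensitivity is roughly logarithmic.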