

posted by on Monday February 06 2017, @01:08PM
from the mine-eyes-have-seen-the-glory dept.

A couple of weeks ago, in a story about someone coming out with 8K resolution televisions, I left a comment to the effect that I have a 4K TV, but there's no 4K content, so an 8K TV was a bit silly. Someone replied that they thought Netflix had a couple of 4K offerings.

I recently ran across news that I'll have 4K content in the nebulous future. The FCC [US Federal Communications Commission] is taking its first steps toward over-the-air 4K broadcasts, but it appears that it may be a while before I see it.

There's more about it here at CNet. But all three articles raise questions that aren't answered, primarily: what about bandwidth? It seems to me that without extremely tight lossy compression, 4K would take four times the bandwidth of 1080p. Will the quality be much better than 1080p after they compress the signal?

How will they get around that? Will I lose some side channels? What do you folks have to say?


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 0) by Anonymous Coward on Monday February 06 2017, @01:45PM

    by Anonymous Coward on Monday February 06 2017, @01:45PM (#463403)

    I have no idea whether they do this, but a possible way to save bandwidth would be to use the fact that the eye's colour resolution is worse than its intensity resolution (well, actually the difference is in some processing steps later in the visual system, but in this context that difference doesn't matter).

    So one way to save bandwidth would be to send the intensity information in 2160p, but the colour information still in 1080p. Assuming currently all channels get the same treatment, this should mean a factor of only 2 instead of 4 (only one of three colour channels needs fourfold bandwidth). Of course the savings are reduced if they are already taking advantage of it in the 1080p encoding.
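The arithmetic behind that factor-of-2 estimate can be sketched quickly. This is a hypothetical back-of-the-envelope count of raw samples per frame, ignoring compression entirely:

```python
# Back-of-envelope bandwidth comparison: raw sample counts per frame.

def samples(width, height, channels=1):
    """Number of raw samples for one frame at the given resolution."""
    return width * height * channels

# Baseline: 1080p with all three channels at full resolution.
base = samples(1920, 1080, channels=3)

# Proposed: 2160p intensity plus two colour channels left at 1080p.
proposed = samples(3840, 2160) + samples(1920, 1080, channels=2)

# Naive: all three channels at 2160p.
naive = samples(3840, 2160, channels=3)

print(proposed / base)  # 2.0 -> only twice the 1080p baseline
print(naive / base)     # 4.0 -> four times the 1080p baseline
```

As the parent notes, the factor of 2 only materializes if the 1080p encoding wasn't already subsampling colour.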

  • (Score: 2) by jcross on Monday February 06 2017, @02:18PM

    by jcross (4009) on Monday February 06 2017, @02:18PM (#463419)

That's called chroma subsampling, and it's already being done. I think to get additional savings while moving up to 4K, we're going to need to come up with novel compression strategies, like converting frame bitmaps into vector graphics with procedural textures or something, so that it turns back into a bitmap only on playback. This would allow us to make use of the display's sharpness without necessarily including full detail for *everything* down to the level of pores and such, and without needing to broadcast a stream at the full resolution of the display. Consider how anti-aliasing makes vector art crisper as display density increases, even if no more detail is added to the vector paths. By analogy, this would mean we'd still get some advantage from an 8K display even with 4K content.
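For illustration, the common 4:2:0 flavour of chroma subsampling amounts to averaging each 2x2 block of a chroma plane down to one sample while luma stays at full resolution. A toy pure-Python sketch (not how a real codec implements it):

```python
# Toy 4:2:0-style chroma subsampling: average each 2x2 block of a
# chroma plane down to one sample (luma would be kept at full size).

def subsample_420(chroma):
    """Downsample a chroma plane (list of rows) by 2x in each direction."""
    out = []
    for y in range(0, len(chroma), 2):
        row = []
        for x in range(0, len(chroma[0]), 2):
            total = (chroma[y][x] + chroma[y][x + 1] +
                     chroma[y + 1][x] + chroma[y + 1][x + 1])
            row.append(total / 4)
        out.append(row)
    return out

plane = [[10, 10, 20, 20],
         [10, 10, 20, 20],
         [30, 30, 40, 40],
         [30, 30, 40, 40]]
print(subsample_420(plane))  # [[10.0, 20.0], [30.0, 40.0]]
```

Each chroma plane shrinks to a quarter of its original sample count, which is where the savings in the parent comment come from.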

  • (Score: 2) by Wootery on Monday February 06 2017, @06:08PM

    by Wootery (2341) on Monday February 06 2017, @06:08PM (#463539)

    Compression experts are no doubt aware of this. A similar trick is to exploit the fact that our visual systems suck when it comes to shades of blue [nfggames.com], so that channel can be compressed more aggressively than red or, especially, green.

    • (Score: 2) by wonkey_monkey on Monday February 06 2017, @07:31PM

      by wonkey_monkey (279) on Monday February 06 2017, @07:31PM (#463582) Homepage

      There's no blue channel, as broadcast video is stored as YUV: a full-resolution Y channel for "brightness" and two lower-resolution channels which are, roughly, offsets from the brightness toward the right colour. I'm not sure, but there might well be an intrinsic bias against such deep blues in that scheme (i.e. only a small range of UV values leads to deep blues).

      --
      systemd is Roko's Basilisk
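The conversion being described can be sketched with BT.601-style coefficients; the exact coefficients and scale factors vary between standards, so treat these as illustrative:

```python
# Sketch of an RGB -> Y'CbCr mapping using BT.601-style coefficients:
# Y carries brightness at full resolution, while Cb and Cr are scaled
# "blue minus luma" and "red minus luma" offsets, kept at lower resolution.

def rgb_to_ycbcr(r, g, b):
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = (b - y) * 0.564   # scaled blue-minus-luma offset
    cr = (r - y) * 0.713   # scaled red-minus-luma offset
    return y, cb, cr

# Pure deep blue: very little ends up in Y; almost all of the signal
# lands in the Cb offset channel.
print(rgb_to_ycbcr(0, 0, 255))  # roughly (29.1, 127.4, -20.7)
```

Note how a saturated blue produces a small Y value, which is consistent with the idea that deep blues live mostly in the chroma channels.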
      • (Score: 2) by Wootery on Tuesday February 07 2017, @09:03AM

        by Wootery (2341) on Tuesday February 07 2017, @09:03AM (#463952)

        I'd assume so. If they're not exploiting that peculiarity of our visual systems, they're missing an important trick, as that website I linked to demonstrates.

        Not sure how that could be mapped to a YUV-style colour space though, but I'd be very surprised if they're not doing anything to bias things, as you say.

        ( ...which is to say, I have nothing of value to add ;-P )

        • (Score: 2) by wonkey_monkey on Tuesday February 07 2017, @05:53PM

          by wonkey_monkey (279) on Tuesday February 07 2017, @05:53PM (#464169) Homepage

          Well, actually I think it is biased in the sense that when converting from RGB to YUV, B contributes the least to the calculation of Y. So the onus for showing blue is shifted to the lower resolution chroma channels.

          --
          systemd is Roko's Basilisk
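To put numbers on that: with the standard BT.601 luma weights (used here purely for illustration), a pure blue contributes far less to Y than a pure green, so most of a deep blue's signal is pushed into the lower-resolution chroma planes:

```python
# BT.601 luma weights: for a pure primary, what fraction of its
# signal lands in the full-resolution Y plane?

LUMA_WEIGHTS = {"red": 0.299, "green": 0.587, "blue": 0.114}

for name, weight in LUMA_WEIGHTS.items():
    print(f"pure {name}: {weight:.1%} of its signal lands in Y")

# Blue carries only ~11% of its signal in luma, the least of the
# three primaries, so fine blue detail survives mostly in chroma.
```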