A couple of weeks ago, on a story about someone coming out with 8K-resolution televisions, I left a comment to the effect that I have a 4K TV but there's no 4K content, so an 8K TV seemed a bit silly. Someone replied that they thought Netflix had a couple of 4K offerings.
I recently ran across news that I'll have 4K content in the nebulous future. The FCC [US Federal Communications Commission] is taking its first steps toward over-the-air 4K broadcasts, but it appears that it may be a while before I see it.
There's more about it here at CNet. But all three articles raise questions that aren't answered, primarily: what about bandwidth? It seems to me that without extremely aggressive lossy compression, a 4K signal would take four times the bandwidth of 1080p. Will quality actually be much better than 1080p once they compress the signal?
How will they get around that? Will I lose some side channels? What do you folks have to say?
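The "four times" figure comes straight from the pixel counts: UHD "4K" (3840×2160) carries exactly four times the pixels of 1080p (1920×1080). A rough back-of-the-envelope sketch, assuming 8-bit 4:2:0 sampling (12 bits per pixel) at 60 fps for both formats — the actual broadcast bitrates depend entirely on the codec and compression settings:

```python
def raw_bitrate_mbps(width, height, bits_per_pixel=12, fps=60):
    """Uncompressed bitrate in megabits per second,
    assuming 8-bit 4:2:0 sampling (12 bits/pixel)."""
    return width * height * bits_per_pixel * fps / 1e6

hd = raw_bitrate_mbps(1920, 1080)    # 1080p: ~1493 Mbps uncompressed
uhd = raw_bitrate_mbps(3840, 2160)   # 2160p "4K": ~5972 Mbps uncompressed
print(uhd / hd)                      # exactly 4.0x the raw data
```

Either way, both are orders of magnitude above what a ~19 Mbps ATSC channel can carry, so heavy lossy compression is unavoidable; the question is only how much quality survives it.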
(Score: 2) by Wootery on Tuesday February 07 2017, @09:03AM
I'd assume so. If they're not exploiting that peculiarity of our visual systems, they're missing an important trick, as that website I linked to demonstrates.
Not sure how that could be mapped to a YUV-style colour space though, but I'd be very surprised if they're not doing anything to bias things, as you say.
( ...which is to say, I have nothing of value to add ;-P )
(Score: 2) by wonkey_monkey on Tuesday February 07 2017, @05:53PM
Well, actually I think it is biased, in the sense that when converting from RGB to YUV, B contributes the least to the calculation of Y. So the onus for showing blue is shifted to the lower-resolution chroma channels.
systemd is Roko's Basilisk