
posted by Fnord666 on Tuesday July 07 2020, @01:55AM   Printer-friendly
from the what-the-H.264? dept.

H.266/VVC Standard Finalized With ~50% Lower Size Compared To H.265

The Versatile Video Coding (VVC) standard is now firmed up as H.266 as the successor to H.265/HEVC.

[...] Fraunhofer won't be releasing H.266 encoding/decoding software until this autumn. It will be interesting to see in the meantime what open-source solutions materialize, and how H.266 ultimately stacks up against the royalty-free AV1.

Fraunhofer HHI is proud to present the new state-of-the-art in global video coding: H.266/VVC brings video transmission to new speeds

Through a reduction of data requirements, H.266/VVC makes video transmission in mobile networks (where data capacity is limited) more efficient. For instance, the previous standard H.265/HEVC requires ca. 10 gigabytes of data to transmit a 90-min UHD video. With this new technology, only 5 gigabytes of data are required to achieve the same quality. Because H.266/VVC was developed with ultra-high-resolution video content in mind, the new standard is particularly beneficial when streaming 4K or 8K videos on a flat screen TV. Furthermore, H.266/VVC is ideal for all types of moving images: from high-resolution 360° video panoramas to screen sharing contents.
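
As a back-of-the-envelope check of those figures, here is a minimal sketch of the average-bitrate arithmetic (the 90-minute and gigabyte numbers come straight from the press release; the helper name is made up):

    # Average bitrate implied by a file size and duration.
    def avg_bitrate_mbps(size_gb, minutes):
        """Megabits per second, taking 1 GB = 8000 megabits."""
        return size_gb * 8000 / (minutes * 60)

    print(avg_bitrate_mbps(10, 90))  # H.265/HEVC: ~14.8 Mbps for a 90-min UHD video
    print(avg_bitrate_mbps(5, 90))   # H.266/VVC:  ~7.4 Mbps at the claimed same quality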

Versatile Video Coding (VVC/H.266):

In October 2015, the MPEG and VCEG formed the Joint Video Exploration Team (JVET) to evaluate available compression technologies and study the requirements for a next-generation video compression standard. The new algorithms should have a 30-50% better compression rate for the same perceptual quality, with support for lossless and subjectively lossless compression. It should support resolutions from 4K to 16K as well as 360° videos. VVC should support YCbCr 4:4:4, 4:2:2 and 4:2:0 with 10 to 16 bits per component, BT.2100 wide color gamut and high dynamic range (HDR) of more than 16 stops (with peak brightness of 1000, 4000 and 10000 nits), auxiliary channels (for depth, transparency, etc.), variable and fractional frame rates from 0 to 120 Hz, scalable video coding for temporal (frame rate), spatial (resolution), SNR, color gamut and dynamic range differences, stereo/multiview coding, panoramic formats, and still picture coding. Encoding complexity of several times (up to ten times) that of HEVC is expected, depending on the quality of the encoding algorithm (which is outside the scope of the standard). The decoding complexity is expected to be about twice that of HEVC.
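
To put those chroma-subsampling and bit-depth figures in perspective, here is a rough sketch of the uncompressed data rate they imply (UHD at 60 Hz is an assumed example; the helper function is hypothetical):

    # Uncompressed data rate for a given chroma subsampling and bit depth.
    # 4:2:0 keeps one Cb and one Cr sample per 2x2 block of luma samples,
    # i.e. 1.5 samples per pixel on average; 4:2:2 gives 2; 4:4:4 gives 3.
    SAMPLES_PER_PIXEL = {"4:2:0": 1.5, "4:2:2": 2.0, "4:4:4": 3.0}

    def raw_gbps(width, height, fps, bits, subsampling):
        samples_per_frame = width * height * SAMPLES_PER_PIXEL[subsampling]
        return samples_per_frame * bits * fps / 1e9

    # UHD (3840x2160), 60 Hz, 10 bits per component, 4:2:0:
    print(raw_gbps(3840, 2160, 60, 10, "4:2:0"))  # ~7.5 Gbps uncompressed

Against the ~15 Mbps HEVC average worked out above, that is a compression ratio on the order of 500:1.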

See also: MPEG: What Happened?
Sisvel Announces AV1 Patent Pool


Original Submission

 
  • (Score: 3, Touché) by The Mighty Buzzard on Tuesday July 07 2020, @12:36PM (16 children)

    I can't really tell the difference between 1080p and 4K unless I'm sitting too close to the TV/monitor for my eyes to focus well. And framerates over 30fps on anything except video games are a guaranteed way to get me to refuse to watch it.
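
    For what it's worth, the usual 20/20-acuity rule of thumb (a pixel stops being resolvable once it subtends less than about one arcminute) roughly backs this up; a sketch, assuming a 55-inch 16:9 screen:

        import math

        # Distance (feet) beyond which a single pixel subtends less than one
        # arcminute, the conventional 20/20 visual-acuity threshold.
        def acuity_distance_ft(diagonal_in, horizontal_pixels, aspect=(16, 9)):
            w, h = aspect
            width_in = diagonal_in * w / math.hypot(w, h)
            pixel_in = width_in / horizontal_pixels
            return pixel_in / math.tan(math.radians(1 / 60)) / 12

        print(acuity_distance_ft(55, 3840))  # 4K:    ~3.6 ft
        print(acuity_distance_ft(55, 1920))  # 1080p: ~7.2 ft

    By this estimate, past about seven feet on a 55-inch set, 1080p already saturates 20/20 vision, so 4K has nothing extra to show.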

    --
    My rights don't end where your fear begins.
  • (Score: 2) by takyon on Tuesday July 07 2020, @01:30PM

    by takyon (881) <takyonNO@SPAMsoylentnews.org> on Tuesday July 07 2020, @01:30PM (#1017616) Journal

    16K 240Hz VR video

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 2) by ledow on Tuesday July 07 2020, @01:48PM (14 children)

    by ledow (5567) on Tuesday July 07 2020, @01:48PM (#1017629) Homepage

    I still just buy SD movies. Save myself a few quid on each movie, honestly don't notice the difference.

    I can spot a stray pixel at a hundred yards. But a movie? Who cares?

    P.S. I don't have a TV and watch most of my movies on a 90" diagonal projected image. It's capable of 1080, but it rarely needs to bother - the menus are higher res than anything I watch via it. I watch my "TV" on the same device, or via an HD laptop, or on my HD smartphone - served by tvHeadend on an RPi 4, which sends the MPEG streams to the device and just lets VLC / Kodi display them. There's no performance issue with HD or SD.

    I have the Freeview HD channels, I never use them. Not even for recording. I watch my TV on my phone abroad, or in my living room, or in a window in the corner of my laptop while I'm doing other things.

    But even when I choose, specifically, to watch anything at all: SD. I can't notice the difference, so why tax the wifi / 4G / storage space / device processing when you can't tell?

    It's not that I couldn't SEE the difference if I really cared. It's that with any of the above setups, I can't notice it. So why bother?

    Still watch DVDs, too.

    I find the whole 4K/8K debacle hilarious. When are you going to stop, guys? I mean, seriously, when do you think you'll no longer be able to improve the resolution or colourspace of a household device? Give me a number. 8K? 16K? At what point will the tech be "good enough" so you can spend your money on something else and never have to worry about it again?

    For me, that happened with HD. I spent decades crying out for my 1024x768 and above SVGA monitors to be matched by the capabilities of a TV (which were horrendous to put a computer image on). When that happened... well, I was done. It's now possible to see the same product under "Computer monitors" as it is under "TVs". It's the same device, they just put a DVB-T tuner or whatever in (in some cases, it's quite literally the same product... sold as a "monitor with TV functionality" and a "TV with HDMI" in two different parts of the store).

    I have personally bought zero monitors in the last 20 years. My last monitor was, after 20 years of service, running the CCTV DVR for the very rare occasion I needed to review footage. That's just now streamed to my phone or connected to my projector.

    My gaming laptop is eight years old, and I'm perfectly happy with it. I knock games down to a sensible resolution if they start to struggle. I admit I need a new laptop, but that's because of the demands of the game, not the resolution. If I bought a new one, I'd turn off all the AA and knock the res down to 1080p even if the laptop was capable of more. I'd rather have a consistent 60fps (or more likely double that with a decent laptop) than 4K with the framerate dropping even a tad in busy scenes.

    • (Score: 3, Insightful) by The Mighty Buzzard on Tuesday July 07 2020, @02:03PM (7 children)

      I can see the benefit all the way up to 1080p, but most of my stuff I drop down to 720p for the huge space savings and a perceptible but quite acceptable quality loss. TV shows I mostly archive in 480p, though. They tend to come in huge quantities compared to movies, so further space savings are warranted.

      --
      My rights don't end where your fear begins.
      • (Score: 2) by martyb on Tuesday July 07 2020, @02:11PM (4 children)

        by martyb (76) Subscriber Badge on Tuesday July 07 2020, @02:11PM (#1017644) Journal

        If it's something I intend to re-watch years later, I'll go for the higher resolution. That way it'll still look ok on my 8K or 16K TV. Otherwise, space savings generally wins out.

        --
        Wit is intellect, dancing.
      • (Score: 0) by Anonymous Coward on Wednesday July 08 2020, @05:47PM (1 child)

        by Anonymous Coward on Wednesday July 08 2020, @05:47PM (#1018302)

        Same here, though for 45-minute-ish TV I try to find the x265 version @720p... for archive.
        However, I keep some "fancy and big-file-size" movies around to test, so I can bitch and moan about tearing and stuttering on the Linux forums ^_^

    • (Score: 3, Insightful) by takyon on Tuesday July 07 2020, @03:51PM (4 children)

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Tuesday July 07 2020, @03:51PM (#1017720) Journal

      But even when I choose, specifically, to watch anything at all: SD. I can't notice the difference, so why tax the wifi / 4G / storage space / device processing when you can't tell?

      It's not that I couldn't SEE the difference if I really cared. It's that with any of the above setups, I can't notice it. So why bother?

      Still watch DVDs, too.

      720p (better than DVD quality) is a good minimum. Treat yourself to 720p at least.
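
      The pixel arithmetic behind "better than DVD quality" (assuming NTSC DVD resolution):

          # Pixel counts: NTSC DVD (720x480) versus 720p (1280x720).
          dvd, hd720 = 720 * 480, 1280 * 720
          print(dvd, hd720, round(hd720 / dvd, 1))  # 345600 921600 2.7x the pixels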

      I find the whole 4K/8K debacle hilarious.

      What's the debacle with 4K? You can get a 4K TV for as low as $200. Even if you don't replace a TV until it breaks, you'll end up with 4K at some point.

      When are you going to stop, guys? I mean, seriously, when do you think you'll no longer be able to improve the resolution or colourspace of a household device? Give me a number. 8K? 16K? At what point will the tech be "good enough" so you can spend your money on something else and never have to worry about it again?

      "32K" video resolution on a "16K" display [soylentnews.org] for VR headsets. Up to a 1000 Hz framerate [blurbusters.com] is desirable, maybe 1200 Hz just to make it cleanly divisible by 30/60/120/240.

      8K should be more than enough for most screen sizes [anandtech.com], but I guess 16K can be a thing if you are able to plaster an entire wall with modular panels [digitaltrends.com].

      A couple of notes on that:

      1. MicroLED seems to be the best display technology conceivable right now (OLED but better), but it needs greater pixel density than that before it can be used at smaller sizes. So you won't necessarily need a wall that large in your house for 16K.

      2. Price is calculated to be somewhere between $700,000 and $6 million for different versions of "The Wall". It's an early adopter technology demo. But if it involves very thin modular segments, I could see the full thing going all the way down to around $2,000 once the technology is mature and there is no-name brand competition. Bezel-less panels that can be snapped together will have the effect of a larger TV, but with fewer manufacturing defects. Thinner means less material is needed to manufacture, and it's easier to ship in small modules. The technology can be made flexible, so it could be made resilient to the point of being shatterproof during transit (plenty of people have ordered a TV shipped to their house and gotten a box of broken glass, leaving the retailer or manufacturer to pay for a replacement).

      It's now possible to see the same product under "Computer monitors" as it is under "TVs". It's the same device, they just put a DVB-T tuner or whatever in (in some cases, it's quite literally the same product... sold as a "monitor with TV functionality" and a "TV with HDMI" in two different parts of the store).

      I have personally bought zero monitors in the last 20 years. My last monitor was, after 20 years of service, running the CCTV DVR for the very rare occasion I needed to review footage. That's just now streamed to my phone or connected to my projector.

      All the specs should be reviewed, such as response time, color depth, etc. (some TVs are garbage), but in many cases TVs are just as good as or better than monitors for desktop purposes, if you have the space for one on a desk. Gamers are partially responsible for the inflated monitor prices.

      If I bought a new one, I'd turn off all the AA and knock the res down to 1080p even if the laptop was capable of more. I'd rather have a consistent 60fps (or more likely double that with a decent laptop) than 4K with the framerate dropping even a tad in busy scenes.

      Getting an emissive display with an "infinite" contrast ratio (OLED, MicroLED in the future) will probably have a greater impact on graphics than 4K/120+Hz.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 0) by Anonymous Coward on Tuesday July 07 2020, @06:20PM (3 children)

        by Anonymous Coward on Tuesday July 07 2020, @06:20PM (#1017803)

        One issue with the wall, at least in its cinema applications, is that sound doesn't (yet) travel through an LCD screen. So the sound all comes from off to the sides and top, which doesn't work well. And it would be worse in a home vs the cinema since you're probably sitting closer to it.

        • (Score: 2) by takyon on Tuesday July 07 2020, @08:01PM

          by takyon (881) <takyonNO@SPAMsoylentnews.org> on Tuesday July 07 2020, @08:01PM (#1017846) Journal

          I don't know much about audio. My assumption is that you could just set up your 5.1/7.1 speakers wherever and use some sort of calibration software to tweak the sound output to adapt to the room's walls, furniture layout, and the speaker setup.

          --
          [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
        • (Score: 2) by ledow on Wednesday July 08 2020, @12:37PM (1 child)

          by ledow (5567) on Wednesday July 08 2020, @12:37PM (#1018159) Homepage

          Personally, I always find the fact that I'm looking at a screen but the "noise" is coming from behind me a bit odd anyway.

          It's not immersive - I can't turn and see the source of that noise, so my brain disconnects from the movie and hits reality again. The guy shooting ISN'T to my left... he's on the screen, or out of sight. And especially when the scenes change... that explosion was behind me, but now suddenly it's in front because the camera has cut. It distorts the suspension of disbelief required.

          I've never understood anything past stereo sound, and even then I could happily just have mono and not care for movies and TV shows. Maybe for music, because you're already listening to an "imaginary" orchestra or whatever, and a bit of separation helps.

          But all these 8.2, sound bars, Dolby stuff... I never understood it, even in a cinema.

          Slight stereo, so the guy on the left of the image is slightly left in the speaker. That's about all you need. And you don't have stuff going over or under because the effects rarely work except in a cinema. And, shush... don't tell anyone... but stereo is all you've got on your phone, etc. anyway.

          It's also the cause of dialogue being drowned out under the music, and all kinds of other problems, because it's so poorly downmixed and so reliant on you having a particular setup that most people just don't have.

          There comes a point with every such technology where I just say "Why?" and stop following anything past that. Video - HD. Audio - Stereo. Dimensions: 2 (3 is fun but we can't make it work properly, though VR is getting very promising). Strangely, that's the common-denominator format for most devices in existence at the moment, and has been unchanged for a long time.

          I'm honestly of the opinion that anything beyond that should be a "check this box to buy/download the 8.1 surround / 4K version" as an optional extra (whether free or not).

          • (Score: 0) by Anonymous Coward on Thursday July 09 2020, @12:37PM

            by Anonymous Coward on Thursday July 09 2020, @12:37PM (#1018625)

            If you get a newer receiver, it will have speaker autodetection, several preset configurations to use, and an option to calibrate (which requires equipment that probably doesn't come with it). Autodetection works pretty damn well.

    • (Score: 2) by Bot on Thursday July 09 2020, @11:23AM

      by Bot (3902) on Thursday July 09 2020, @11:23AM (#1018608) Journal

      I use a CRT for PAL up to DVD material. It deals well with interlaced content and poor dynamic range.
      A PC LCD is good for HD to FHD, but older monitors have less saturation than newer ones and cellphones/tablets.
      On the TV, FHD is enough; HD is indistinguishable if you sit far enough from the screen.
      4K, I don't like the colors. On one hand, the dynamic range and color depth are closer to reality. On the other hand, FHD and its color space look more filmic. Ditto 24 vs 60 Hz.

      Most youtube I watch, I don't even bother going full screen.

      I will maybe capture 4K but output FHD for a long time. I will also soon try raw HD video, because it reminds me of the (personally speaking) unmatched quality of analog satellite on a CRT.

      But the elephant in the room is the licensing. Apparently MPEG LA or whoever has some rights over any material THAT WAS ENCODED AT ONE POINT AS H.264. Now, your cam does native encoding but only paid for a consumer license? I guess we have a problem. Maybe it will be a nothingburger like MP3, maybe not. Are you going to rely on a maybe if you're a pro? I'd rather explore VP8/9 or AV1.
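
      For anyone wanting to explore the AV1 route, a minimal sketch of a constant-quality encode through FFmpeg's libaom-av1 encoder (this assumes an FFmpeg build with libaom enabled; the file names are placeholders):

          import subprocess

          # Constant-quality AV1 encode with libaom-av1: lower CRF = higher
          # quality, and -b:v 0 switches libaom into pure constant-quality mode.
          subprocess.run([
              "ffmpeg", "-i", "input.mp4",
              "-c:v", "libaom-av1", "-crf", "30", "-b:v", "0",
              "-c:a", "copy",  # keep the original audio track as-is
              "output.mkv",
          ], check=True)

      Expect libaom to be slow compared to x264/x265; that's the usual trade-off for the royalty-free route.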


      --
      Account abandoned.