posted by CoolHand on Saturday September 24 2016, @02:02AM   Printer-friendly
from the hehe-they-said-pissing-hehe dept.

Get ready to endlessly debate the value of "native 4K" on consoles

http://arstechnica.com/gaming/2016/09/microsoft-and-sonys-emerging-4k-pixel-pissing-contest/

Sony's PlayStation 4 Pro (launching in November) and Microsoft's Xbox One Scorpio (launching late next year) are giving the pixel-counters out there a new, 4K-sized battlefield to fight over. Now, Microsoft is drawing a line in the sand in that developing battle, with Microsoft Studios Publishing General Manager Shannon Loftis telling USA Today that "any games we're making that we're launching in the Scorpio time frame, we're making sure they can natively render at 4K."

The word "natively" is important there, because there has been a lot of wiggle room when it comes to talking about what constitutes a truly "4K" game these days. For instance, according to developers Ars has talked to, many if not most games designed for the PS4 Pro will be rendered with an internal framebuffer that's larger than that for a 1080p game, but significantly smaller than the full 3840×2160 pixels on a 4K screen (the exact resolution for any PS4 Pro game will depend largely on how the developer prioritizes the frame rate and the level of detail in the scene). While the PS4 Pro can and does output a full 4K signal, it seems that only games with exceedingly simple graphics will be able to render at that resolution natively.

-- submitted from IRC


Original Submission

 
  • (Score: 4, Insightful) by physicsmajor on Saturday September 24 2016, @03:34AM

    by physicsmajor (1471) on Saturday September 24 2016, @03:34AM (#405836)

    So, you're joking, but you know what? There is a limit. And while it's above 360p, we've passed it.

    If you sit a normal distance from your television (i.e., at least 6 feet) and your screen is smaller than 50 inches or so, you cannot perceive anything more than 720p on that screen. It's entirely wasted. If you stick your nose right up to it you can, but not in a living room. It doesn't matter how good your vision is; that's the limit of resolution in the fovea.
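The geometry behind claims like this is easy to check. A minimal Python sketch, assuming the common 1-arcminute-per-pixel rule of thumb for 20/20 foveal acuity (the rule itself is a simplification, so treat the numbers as rough):

```python
import math

def resolvable_pixels(diagonal_in, distance_in, aspect=(16, 9),
                      acuity_arcmin=1.0):
    """Horizontal pixel count a viewer can resolve across the screen,
    given the screen diagonal and viewing distance in inches."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)
    # Total horizontal viewing angle subtended by the screen, in arcminutes.
    view_arcmin = math.degrees(2 * math.atan2(width_in / 2, distance_in)) * 60
    return view_arcmin / acuity_arcmin

# The scenario above: a 50 inch 16:9 screen viewed from 6 feet.
print(round(resolvable_pixels(50, 6 * 12)))  # roughly 2000 horizontal pixels
```

By this rule of thumb, a 50 inch screen at 6 feet lands near the 1920-wide (1080p) mark rather than 720p, which is part of why this threshold gets argued over.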

    Computer monitors at a desk can perhaps make use of 2560x1440 or so. Once we start talking 4K, it's just stupid stats: you waste power and tank your framerates trying to drive pixels you can't see anyway. The one and only use of actual, displayed 4K resolution I could see is VR. It sure as hell isn't useful in your living room. That said, essentially the best antialiasing method in existence is rendering at a higher resolution and downsampling. There is truth to that, but it doesn't seem to be what this pissing contest is about.
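The downsampling point can be made concrete with a toy example: render at twice the target resolution, then average each 2x2 block into one display pixel. This is a plain box filter; real renderers use fancier kernels, so it is only a sketch of the idea:

```python
def ssaa_downsample(img, factor=2):
    """Supersampling AA sketch: average each factor-by-factor block of a
    2-D grayscale image rendered at factor times the display resolution."""
    h, w = len(img), len(img[0])
    out = []
    for y in range(0, h, factor):
        row = []
        for x in range(0, w, factor):
            block = [img[y + dy][x + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# A hard edge rendered at 2x: the boundary pixel averages to 0.5,
# which is exactly the softened, anti-aliased value you want.
hi_res = [[0.0, 0.0, 0.0, 1.0]] * 4
print(ssaa_downsample(hi_res))  # [[0.0, 0.5], [0.0, 0.5]]
```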

    Don't even get me started on phone screen resolutions. If you are above 720p there, you are wasting your battery on a useless statistic for any smartphone use case.

  • (Score: 2) by blackhawk on Saturday September 24 2016, @06:45AM

    by blackhawk (5275) on Saturday September 24 2016, @06:45AM (#405869)

    I think you might be underestimating the figures a little. I sit about 7 feet away from a 55 inch TV which is a mid-range Sony 4k set.

    Watching content at various resolutions, for me, is more like:

    * 480p - it's an awful blurry mess that my TV's smart scaling works its hardest to make watchable
    * 576p - blurry and lacking definition
    * 720p - the absolute minimum resolution I can stand to watch; at this point it's still somewhat blurry and lacking a good deal of the detail I crave
    * 1080p - the picture now looks pretty decent, and on high-bitrate, high-dynamic-range sources it looks really quite good - I enjoy watching at this level of detail
    * 4K - I can see extra definition in the picture over 1080p, colours pop (HDR), grain is reduced, detail level is excellent

    You might be right that 4K is more than I need, but what I need and can perceive is above 1080p and far, far beyond the crappy 720p you think I can see. I can easily see the difference between 4K and 1080p content.

    The changes become more subtle above 1080p, but they are present. I probably don't need the entire 4K, but I'd rather have some unused bandwidth than sit there looking at a picture I know could be much better.

    • (Score: 2) by physicsmajor on Saturday September 24 2016, @01:39PM

      by physicsmajor (1471) on Saturday September 24 2016, @01:39PM (#405932)

      I strongly suspect that the reason for this is that TV manufacturers have no incentive to put good scaling chips in their monitors. They want you to believe that the best is the highest resolution, so deliberately cheap out (or sabotage in software) how lower resolution signals are upscaled and displayed.

      Try this sometime:

      - Connect a computer to one of your auxiliary inputs.
      - Download MPC-HC.
      - Play back a lower-resolution recording (from a free Blender movie or otherwise) with the computer doing the upscaling, sending a native 4K signal to the TV.

      With that system you're right on the physical borderline of being able to tell the difference between 720p and 1080p. However, I think with that test you'll find no difference whatsoever between 1080p and 4K. I sit 3-4 feet from 22 inch monitors and there is no perceptible difference between 720p and 1080p.

      • (Score: 2) by blackhawk on Saturday September 24 2016, @02:42PM

        by blackhawk (5275) on Saturday September 24 2016, @02:42PM (#405940)

        I have a mid-range Sony Bravia set which comes with a pretty impressive scaler. It's not one of the cheap-brand things you get from Best Buy. I'll look into running that test; since I have my PC hooked up via a nice long HDMI cable, it should be easy enough to do.

        For me, what I'm mostly noticing from 720p upwards is the lack of grain and noise, or of a kind of high-passed look to the output. It's softer, less gritty.

        BTW, the upstairs TV is a cheaper model (50 inch) that I believe has only a 720p physical resolution but accepts 1080p signals. Looking at that screen from a foot or two away I can clearly see the individual pixels, and even at about 10 feet it is clearly a much worse picture when displaying the Kodi interface and during playback of media. The fonts in the Kodi interface appear smeared and blurry.

        My eyesight has been tested recently and it's close to perfect. Sitting 2 feet back from my 24 inch 1080p monitor I can make out the single pixel dot above the 'i' character quite clearly.

      • (Score: 2) by Scruffy Beard 2 on Saturday September 24 2016, @03:41PM

        by Scruffy Beard 2 (6030) on Saturday September 24 2016, @03:41PM (#405958)

        I suspect that 4k gives you higher dynamic range, not extra resolution.

        It is essentially the equivalent of dithering.

        Though 1080p on a 42" TV is definitely blurry at 1-2 feet (the viewing distance my brother uses due to vision problems).

        • (Score: 2) by blackhawk on Saturday September 24 2016, @05:48PM

          by blackhawk (5275) on Saturday September 24 2016, @05:48PM (#405992)

          The 4K Sony Bravia I am watching on does have high dynamic range, and I can't recommend that enough. Anyone should be able to pick out the better saturation of colours and the wider range from black to white - particularly in dark scenes. I've watched a few series now as 1080p 10-bit H.265 encodes and they look glorious compared to previous versions. This is a combination of the improved colour gamut and the bitrate efficiency of the H.265 codec (assuming the encoder selected a decent rate and didn't just try to crush a 45-minute show down to 230MB files - which does work pretty well, oddly enough).

          Viewing distance, screen size, and resolution all have to be taken into account at once when talking about this subject, though. None of those figures alone, or in conjunction with just one other, gives the true picture; what matters is pixels per degree, i.e. the number of pixels displayed per degree of viewing angle at the subject's eye.
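That pixels-per-degree figure is straightforward to compute. A sketch for the setup described in this thread (a 55 inch 16:9 set at roughly 6.75 feet; the ~60 pixels per degree often quoted as the 20/20 threshold is a rule of thumb, not a hard limit):

```python
import math

def pixels_per_degree(h_pixels, diagonal_in, distance_in, aspect=(16, 9)):
    """Horizontal pixels per degree of viewing angle."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)
    view_deg = math.degrees(2 * math.atan2(width_in / 2, distance_in))
    return h_pixels / view_deg

# 55 inch set viewed from about 6.75 feet (81 inches):
for name, px in [("720p", 1280), ("1080p", 1920), ("4K", 3840)]:
    print(name, round(pixels_per_degree(px, 55, 81), 1))
```

At this distance 1080p lands close to the often-quoted 60 pixels-per-degree acuity figure and 4K well above it, which is consistent with the differences above 1080p being subtle.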

          I actually sit a lot closer to my TV than most people, and a 55 inch screen is decently large, so I have no problem spotting the difference between 720p and 1080. Above 1080p it's harder, a lot harder and it comes more as a qualitative thing. It's not individual pixels you are seeing, but how they react as a whole.

          Textures become more defined, in particular hair and fur. Fresnel lighting, which tends to flicker outrageously at lower resolutions, will simmer down or be removed entirely. You are getting the benefits you'd normally expect from anti-aliasing, but without any of the downsides, because the extra resolution is actually there. I can see pore detail on skin better, and individual flyaway hairs stand out well at 1080p and above. They are present at 720p but less distinct.

          You will notice the extra detail when the camera pans, as flicker that would be present in 1080p and lower content is vastly reduced at 4k or not present at all.

          I work in game development, so looking for that extra detail is pretty much in my job description. I know other people who can't see the difference, but I can, so it's totally worth the extra cost for me.

      • (Score: 2) by blackhawk on Sunday September 25 2016, @06:03AM

        by blackhawk (5275) on Sunday September 25 2016, @06:03AM (#406180)

        So I ran the test requested by physicsmajor and can offer you my empirical results.

        Setup:

        * The TV set is a Sony Bravia 55 inch
        * I'm a programmer in my late forties who works with games and media
        * My head is about 6.5-7.0 feet from the TV set
        * Media tested was Sintel from the Blender Project, H.264 versions at 720p, 1080p and 4K
        * I used an HDMI 4K/3D capable cable from my PC to the TV
        * All tests ran native at 4K on the PC
        * The VLC decoder was used in fullscreen mode (MPC was being a twat about resizing the playback image to the window size)

        The test media is provided at quite a decent bitrate - higher than is normal for downloaded media - so many of the image-quality issues I dislike at 720p were no longer present. While a 720p scene release tends to be a bit "smeary", and of course a little flat in the colour range, this wasn't an issue with the Sintel media.

        At 720p the picture looked fine. It was quite watchable, and unless you were looking for it you might not realise you weren't seeing a 1080p picture. But this cleanness came at the cost of high-frequency detail, which was less present in the image. The opening minute was almost the same no matter the resolution, because it's a snow/black-rock scene with almost no high-frequency detail. Oddly, when I watched it in 4K I thought I saw that it was snowing for the first time. I'll have to go back and check the others now to see if it really was snowing.

        The details I was looking for in particular are:

        * the cloth in the main character's bra-top
        * specular highlights off the walls in the city, the large tree, the walls of the cave, the end dragon, etc
        * increased detail in areas where specular highlights typically occur

        In the 720p presentation, a lot of the areas that show as "glinting" or "flashing" in 1080p are much softer and appear more as a grey than a harsh flashing between deep greys and bright whites. The areas most affected are those with high-frequency detail that is specular in nature. Overall, the film was watchable, and you might not feel like you are missing any detail, especially if you didn't know it was there to begin with.

        Once I watched the 1080p presentation, I saw those areas appear as textured zones, and they started to noticeably flash as the Fresnel changed (the angle at which you are viewing them combined with a sharp incidence of reflected light). I noticed some areas now appear more detailed. Some parts of the leather armour from the fight scene have just slightly better-looking texture, but it's very subtle. Her shoulderpad now shows glinting as it shifts in the light and is quite noisy, as is her top.

        I went back and viewed both the 720p and 1080p versions again to confirm I had correctly identified differences in the picture. Having done so, I can confirm I can definitely see those differences, and it wasn't hard to tell which was which. I can usually tell when I am watching 720p rather than 1080p on the shows I watch, so this is just further confirmation of that.

        With the baseline established, it was time to check whether I could really see anything above the 1080p mark. Note: it doesn't matter one iota if I can only see 1200p or 1440p instead of the full 4K, since TVs of those resolutions aren't commonly for sale. All that matters is whether I can see an improvement in the picture from 1080p to 4K.

        I fired up the test once more. This time, like the last, there was no real difference for the first minute or so - except, was that snow just then? When we reach the city I start to notice that the flashing off her bra and shoulderpad is seriously reduced and more detail is apparent instead. Background scenery is almost indistinguishable from 1080p, but it does look like there's more "grain" or "grit" to the picture. When the dragon flies over the tower, the vast swathes of screen flicker from 1080p are now gone or almost gone. This comes from the Fresnel and specular components on noisy surfaces. Bra and shoulderpad look great, detailed, and without flicker.

        The tree is an area of very high noise and light variation. It does still flicker, but the area of flickering is smaller and the intensity is reduced.

        The same goes for the inside of the cave, which is now beautifully detailed, with fine filigree visible in the specular reflections. The same again for the dragon. The cave scene will be the easiest place for anyone to see the difference that 4K brings.

        The end result is that I can see the difference between 720p, 1080p and 4K. Above 1080p it becomes a lot more subtle, and you have to know what you're looking for to see it. It's mostly found in high-frequency specular areas, particularly ones with a high gloss and albedo difference.

        I can watch a high-bitrate 720p and be happy with it, and I do for material where I don't mind a little loss of detail. I watch most of my stuff in 1080p; it's good enough for almost anything I want to watch. In many scenes it would be difficult to even see the difference from 1080p to 4K. But there's a small amount of content where I really want to see every little detail possible and have a truly stunning picture, and that's what I get in 4K. I can see the individual strands in the weave of a dark suit. Every little flyaway hair on the actors' heads sticks out clearly. Pores are very clear to see, along with any blemishes on the skin. Metals show tiny detail like pit marks. Every little detail is there to see.

        Is 4K over the top and beyond our physical ability to see? You mentioned the limit of the fovea, and while that's true, I don't think it's the whole picture. Being able to see an individual pixel isn't the limit; how those pixels interact with the ones around them matters as well. Vision is also not a frame-based, pixel-based thing - it's continuous - so I think we gain a little more "resolution" from being able to take a number of snapshots of each frame from microscopically different views. We might actually register a pixel shown at 30FPS ten times in our brain's wetware, each ever so slightly differently, and that gets used to build the internal picture of the view.

        Anyway, the bottom line is I can see the differences in the picture. I believe most people could, if they knew what they were looking for and what was actually missing from the picture they are viewing. 1080p is almost certainly good enough for the majority, but I love watching shows, and it's totally worth it to me to pay a bit more to get the picture quality of my choice.

    • (Score: 2) by ledow on Saturday September 24 2016, @04:46PM

      by ledow (5567) on Saturday September 24 2016, @04:46PM (#405972) Homepage

      I was absolutely IN AWE of the resolution of my 1024x768 monitor when I put a WinTV card in the machine and watched bog-standard, pre-HD analog video (which would have been PAL in my area) on it.

      It was crisp, sharp, amazingly clear, even when "scaled".

      My eyesight hasn't improved in the 20 years since then, so I see no need for even HD.

      But it was ALWAYS a different matter for TV. No matter how many gadgets I used to put the signal into the TV, no amount of VGA or component video converters to standard TV resolutions worked nicely until we started getting HD TVs.

      The reason I have an HD laptop is for the word processing, the web browsing and the text. The TV is now capable of taking that signal and reproducing it (because it's nothing more than a large LCD monitor nowadays).

      Beyond that? You're just pissing away processing power. I buy all my online content in SD and never notice. For moving scenes, I honestly can't spot a difference. And neither can anybody who comes to my house and tries to do so fairly (my TV is pretty nondescript, but if you Google the model number or stick your face so close that you can see the RGB elements, that's just cheating).

      At any sensible distance, for any sensible screen size (32" for my TV, 17" for my laptop), SD is more than good enough for most people. HD collects all the outliers who can see things that others can't or want to convince themselves.

      4K is just a horrendous waste of money. Especially if - when you play it on an HD screen - it isn't optimised enough to get you full FPS. I guarantee you that some games will be "4K only" and you won't be able to dial them down.

      There was a point where analog->digital, VGA->HDMI, SD->HD, 25fps->60fps, etc. made a difference that you could see. Those days are long-gone.

      I projected a VGA image to a 200" projected screen recently. Nobody even noticed or cared. And they were doing an art project with a very particular artist who was fussing about making it look exactly right. Nobody asked for higher resolutions, or HDMI or anything, even though they were available. If you can't spot it or care on a 200" screen for a recorded art performance, you aren't going to care about 4K vs HD on your 32" bedroom screen for the kids to play games on, when they're sitting feet away from it.

      If you can truly spot the difference at a normal viewing distance on ALL hardware (Sony and big-end things tend to be WORSE for this; I've compared them to old CRTs and even cheap Samsung junk, and the cheap stuff does Blu-ray, DVD and even upscaling better - Sony really wants you to think your wasted money was worthwhile), then I pity you. Your digital media life is always going to be expensive, and it will suck for you.

      But I would pity you even more if I thought that, 20 years ago, you never once complained that the resolution was still bad even after you went from SD to HD or analog to digital, etc. - because, actually, you can't tell, and you're just trying to justify your expensive and unnecessary purchases.

      • (Score: 2) by blackhawk on Saturday September 24 2016, @06:01PM

        by blackhawk (5275) on Saturday September 24 2016, @06:01PM (#405999)

        I'm sitting a little over 6 feet from a 55 inch TV, so my viewing experience isn't the same as yours sitting god knows how far from a 32 inch screen. The screen is quite probably larger in my field of view than you are used to.

        I'm working on writing a game now, so I have a few opinions on 4K gaming, and the main one is that it isn't going to be worth a damn for 4+ years. We need to wait a while for high-end graphics cards to catch up to what the monitors and TVs can output. It puts a tremendous strain on the CPU/GPU and even the memory architecture to push all that data out the HDMI connector. If you can spend $1000USD on a video card or get a pair of 980s then you can play in 4K now, but for most it is out of reach. I'd rather hit 60+ FPS at a stable rate than worry about 4K for gaming. And before you ask: yes, I can damn well tell when a game is running at only 30FPS; anything below about 55FPS is jittery and janky as hell to me. 60FPS is usually fine, but for VR I don't feel quite right unless it's 90FPS.

        I honestly wouldn't have bought a 4K TV at this time. I had a perfectly good Sony Bravia which was 9 years old and had a great picture at 1080p. It eventually burnt out the capacitor on the backlight (I believe) and would no longer show anything more than a blank screen when starting up.

        Given I had to replace the TV the choice came down to saving a few bucks now and getting a 1080p, or future proofing and getting a TV I will enjoy for the next 5+ years. The 4K TV has greater colour gamut, improved motion handling, improved black levels and a better picture in so many ways over the cheaper sets - it wasn't hard to decide to drop the money and get a TV I would still want to use several years from now when 4k content starts to come out.

  • (Score: 1, Insightful) by Anonymous Coward on Saturday September 24 2016, @05:06PM

    by Anonymous Coward on Saturday September 24 2016, @05:06PM (#405977)

    4k is for programmers. [tsotech.com] I use a 4k, 39" TV as a computer monitor at 2-3 ft distance. With the higher resolution, new usage patterns become possible.