
posted by CoolHand on Saturday September 24 2016, @02:02AM   Printer-friendly
from the hehe-they-said-pissing-hehe dept.

Get ready to endlessly debate the value of "native 4K" on consoles

http://arstechnica.com/gaming/2016/09/microsoft-and-sonys-emerging-4k-pixel-pissing-contest/

Sony's PlayStation 4 Pro (launching in November) and Microsoft's Xbox One Scorpio (launching late next year) are giving the pixel-counters out there a new, 4K-sized battlefield to fight over. Now, Microsoft is drawing a line in the sand in that developing battle, with Microsoft Studios Publishing General Manager Shannon Loftis telling USA Today that "any games we're making that we're launching in the Scorpio time frame, we're making sure they can natively render at 4K."

The word "natively" is important there, because there has been a lot of wiggle room when it comes to talking about what constitutes a truly "4K" game these days. For instance, according to developers Ars has talked to, many if not most games designed for the PS4 Pro will be rendered with an internal framebuffer that's larger than that for a 1080p game, but significantly smaller than the full 3840×2160 pixels on a 4K screen (the exact resolution for any PS4 Pro game will depend largely on how the developer prioritizes the frame rate and the level of detail in the scene). While the PS4 Pro can and does output a full 4K signal, it seems that only games with exceedingly simple graphics will be able to render at that resolution natively.

-- submitted from IRC


Original Submission

 
  • (Score: 2) by physicsmajor on Saturday September 24 2016, @01:39PM

    by physicsmajor (1471) on Saturday September 24 2016, @01:39PM (#405932)

    I strongly suspect the reason for this is that TV manufacturers have no incentive to put good scaling chips in their sets. They want you to believe that the best picture comes from the highest resolution, so they deliberately cheap out on (or sabotage in software) how lower-resolution signals are upscaled and displayed.
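
    You can get a feel for how much the scaler matters with a couple of lines of Python (a hypothetical sketch using Pillow; the filenames are made up):

        from PIL import Image

        # Hypothetical 720p frame grab; any low-resolution image will do.
        frame = Image.open("frame_720p.png")
        target = (3840, 2160)  # UHD "4K"

        # A cheap TV scaler behaves roughly like nearest-neighbour;
        # a good one behaves more like Lanczos resampling.
        frame.resize(target, Image.Resampling.NEAREST).save("upscaled_cheap.png")
        frame.resize(target, Image.Resampling.LANCZOS).save("upscaled_good.png")

    View both full screen and the difference is obvious - that gap is what a good external scaler can close.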

    Try this sometime:

    - Connect a computer to one of your auxiliary inputs.
    - Download MPC-HC.
    - Play back a lower-resolution recording (from a free Blender movie or otherwise) with the computer doing the upscaling, sending a native 4K signal to the TV (a scripted equivalent is sketched below).
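
    If MPC-HC isn't an option, a small Python wrapper around mpv does the same job (a sketch, assuming mpv is installed; the filename is made up). In fullscreen, mpv upscales to the display's native resolution itself, so the TV receives an already-scaled 4K signal:

        import subprocess

        CLIP = "sintel_720p.mkv"  # hypothetical low-resolution test clip

        subprocess.run([
            "mpv",
            "--fullscreen",
            "--scale=ewa_lanczossharp",  # high-quality upscaler, instead of the TV's built-in one
            CLIP,
        ])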

    With that setup you're right at the physical limit of being able to tell the difference between 720p and 1080p. However, I think with that test you'll find no difference whatsoever between 1080p and 4K. I sit 3-4 feet from 22 inch monitors and there is no perceptible difference between 720p and 1080p.

  • (Score: 2) by blackhawk on Saturday September 24 2016, @02:42PM

    by blackhawk (5275) on Saturday September 24 2016, @02:42PM (#405940)

    I have a mid-range Sony Bravia set which comes with a pretty impressive scaler - it's not one of the cheap-brand things you get from Best Buy. I'll look into running that test; since I have my PC hooked up via a nice long HDMI cable, it should be easy enough to do.

    For me, what I mostly notice in 720p versus higher resolutions is the lack of grain / noise, or a kind of high-passed look to the output. It's softer, less gritty.

    BTW, the upstairs TV is a cheaper model (50 inch) that I believe only has 720p physical resolution but takes in 1080p signals. Looking at that screen from a foot or two away I can clearly see the individual pixels, and even at about 10 feet it is clearly a much worse picture when displaying the Kodi interface and during playback of media. The fonts on the Kodi interface appear smeared and blurry.

    My eyesight has been tested recently and it's close to perfect. Sitting 2 feet back from my 24 inch 1080p monitor I can make out the single pixel dot above the 'i' character quite clearly.

  • (Score: 2) by Scruffy Beard 2 on Saturday September 24 2016, @03:41PM

    by Scruffy Beard 2 (6030) on Saturday September 24 2016, @03:41PM (#405958)

    I suspect that 4k gives you higher dynamic range, not extra resolution.

    It is essentially the equivalent of dithering (see the sketch below).

    Though 1080p on a 42" TV is definitely blurry at 1-2 feet (the viewing distance my brother uses due to vision problems).
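
    The dithering analogy can be made concrete with a quick numpy experiment (a hypothetical sketch, not from this thread): quantize a smooth ramp to a handful of grey levels, once directly and once at double resolution with dither noise, then average the dithered version back down the way the eye would.

        import numpy as np

        rng = np.random.default_rng(0)

        # A smooth horizontal ramp, values in [0, 1].
        ramp = np.tile(np.linspace(0.0, 1.0, 256), (64, 1))

        LEVELS = 4  # deliberately coarse quantization

        def quantize(img, dither=False):
            """Quantize to LEVELS grey levels, optionally with random dithering."""
            noise = rng.uniform(-0.5, 0.5, img.shape) if dither else 0.0
            return np.clip(np.round(img * (LEVELS - 1) + noise), 0, LEVELS - 1) / (LEVELS - 1)

        # "Low res": quantize directly.
        low = quantize(ramp)

        # "High res": quantize a 2x upsampled image with dithering, then let
        # the eye (here: a 2x2 box filter) average it back down.
        hi = quantize(np.kron(ramp, np.ones((2, 2))), dither=True)
        averaged = hi.reshape(64, 2, 256, 2).mean(axis=(1, 3))

        print("RMSE, plain quantization: ", np.sqrt(np.mean((low - ramp) ** 2)))
        print("RMSE, dithered + averaged:", np.sqrt(np.mean((averaged - ramp) ** 2)))

    The averaged dithered version tracks the ramp roughly twice as closely - extra effective dynamic range bought purely with spatial resolution.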

    • (Score: 2) by blackhawk on Saturday September 24 2016, @05:48PM

      by blackhawk (5275) on Saturday September 24 2016, @05:48PM (#405992)

      The 4K Sony Bravia I am watching on does have high dynamic range, and I can't recommend that enough. Anyone should be able to pick out the better saturation of colours and the wider range from black to white - particularly in dark scenes. I've watched a few series now in 1080p 10-bit H.265 encodes and they look glorious compared to previous versions. This is a combination of both the improved colour gamut and the bitrate efficiency of the H.265 codec (assuming the encoder selected a decent rate and didn't just try to crush a 45-minute show down to 230MB files - which does work pretty well, oddly enough).

      Viewing distance, screen size, and resolution all have to be taken into account at once when talking about this subject though. None of those figures alone, or in combination with just one other, gives the true picture - that would be pixels per degree, i.e. the number of pixels displayed per degree of visual angle at the viewer's eye.
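
      For concreteness, that pixels-per-degree figure is easy to compute (a Python sketch; the ~60 px/degree rule of thumb for 20/20 vision is a common figure, not something from this thread):

          import math

          def pixels_per_degree(diagonal_in, horizontal_px, distance_ft):
              """Horizontal pixels per degree of visual angle for a 16:9 screen."""
              width_in = diagonal_in * 16 / math.hypot(16, 9)  # screen width from diagonal
              distance_in = distance_ft * 12.0
              pixel_in = width_in / horizontal_px
              # small angle subtended by one pixel at the centre of the screen
              deg_per_px = math.degrees(2 * math.atan(pixel_in / (2 * distance_in)))
              return 1.0 / deg_per_px

          # 55-inch screen viewed from ~6.75 feet, per the setup described below:
          for name, px in [("720p", 1280), ("1080p", 1920), ("4K", 3840)]:
              print(f"{name:>5}: {pixels_per_degree(55, px, 6.75):5.1f} px/degree")

      At that distance 720p lands well below the ~60 px/degree threshold, 1080p just under it, and 4K well above - which lines up with 720p vs 1080p being easy to spot and 1080p vs 4K being much subtler.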

      I actually sit a lot closer to my TV than most people, and a 55 inch screen is decently large, so I have no problem spotting the difference between 720p and 1080p. Above 1080p it's harder, a lot harder, and it becomes more of a qualitative thing. It's not individual pixels you are seeing, but how they react as a whole.

      Textures become more defined, in particular hair and fur. Fresnel lighting, which tends to flicker outrageously at lower resolutions, will simmer down or disappear entirely. You are getting the benefits you'd normally expect from anti-aliasing, but without any of the downsides, because the extra resolution is actually there. I can see pore detail on skin better, and individual flyaway hairs stand out well at 1080p and above. They are present at 720p but less distinct.

      You will notice the extra detail when the camera pans, as flicker that would be present in 1080p and lower content is vastly reduced at 4k or not present at all.
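
      That pan flicker is really just undersampling of high-frequency specular detail, and you can watch supersampling kill it in a toy numpy experiment (a hypothetical sketch, nothing to do with any real engine):

          import numpy as np

          def scene(x, phase):
              # Toy "specular glints": a very high-frequency, sharply peaked pattern.
              return np.maximum(0.0, np.sin(200.0 * x + phase)) ** 8

          def render(n_pixels, phase, supersample=1):
              # Sample the scene, then box-filter down to the displayed resolution.
              n = n_pixels * supersample
              x = (np.arange(n) + 0.5) / n
              return scene(x, phase).reshape(n_pixels, supersample).mean(axis=1)

          def flicker(n_pixels, supersample):
              # Pan the "camera" slightly and measure frame-to-frame change.
              a = render(n_pixels, 0.00, supersample)
              b = render(n_pixels, 0.05, supersample)
              return np.abs(a - b).mean()

          print("frame-to-frame change, 1 sample/pixel:", flicker(64, 1))
          print("frame-to-frame change, 8 samples/pixel:", flicker(64, 8))

      Sampling the glints more finely per displayed pixel - which is effectively what the extra 4K resolution does - cuts the frame-to-frame change by a large factor.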

      I work in game development, so looking for that extra detail is pretty much in my job description. I know other people who can't see the difference, but I can, so it's totally worth the extra cost for me.

  • (Score: 2) by blackhawk on Sunday September 25 2016, @06:03AM

    by blackhawk (5275) on Sunday September 25 2016, @06:03AM (#406180)

    So I ran the test requested by physicsmajor and can offer you my empirical results.

    Setup:

    * The TV set is a Sony Bravia 55 inch
    * I'm a late-forties programmer who works with games and media
    * My head is about 6.5 - 7.0 feet from the TV set
    * Media tested was Sintel from the Blender project, in H.264 versions at 720p, 1080p, and 4K
    * I used a 4K / 3D capable HDMI cable from my PC to the TV
    * All tests ran natively at 4K on the PC
    * The VLC decoder was used in fullscreen mode (MPC was being a twat about resizing the playback image to the window size)

    The test media is provided at quite a decent bitrate - higher than is normal for downloaded media - so many of the image-quality issues I dislike at 720p are no longer present. While a 720p scene release tends to be a bit "smeary" and of course a little flat in the colour range, this wasn't an issue with the Sintel media.

    At 720p the picture looked fine. It was quite watchable, and unless you were looking for it you might not realise you weren't seeing a 1080p picture. But that came at the cost of high-frequency detail, which was noticeably reduced. The opening minute was almost the same no matter the resolution, because it's a snow / black rock scene with almost no high-frequency detail. Oddly, when I watched it in 4K I thought I saw falling snow for the first time. I'll have to go back and check the others now to see if it really was snowing.

    The details I was looking for in particular are:

    * the cloth in the main character's bra-top
    * specular highlights off the walls in the city, the large tree, the walls of the cave, the end dragon, etc
    * increased detail in areas where specular highlights typically occur

    In the 720p presentation, a lot of the areas that show as "glinting" or "flashing" in 1080p are much softer, appearing as a flat grey rather than harsh flashing between deep greys and bright whites. The areas most affected are those with high-frequency detail that is specular in nature. Overall, the film was watchable, and you might not feel like you are missing any detail, especially if you didn't know it was there to begin with.

    Once I watched the 1080p presentation, those same areas appeared as textured zones and started to noticeably flash as the Fresnel term changed (the angle at which you view a surface combined with sharply incident reflected light). Some areas now appeared more detailed. Parts of the leather armour from the fight scene have just slightly better-looking texture, but it's very subtle. Her shoulderpad now shows glinting as it shifts in the light and is quite noisy, as is her top.

    I went back and viewed both the 720p and 1080p again to confirm I had correctly identified differences in the picture. Having done so, I can confirm I can definitely see the differences, and it wasn't hard to tell which was which. I can usually tell when I am watching 720p rather than 1080p on the shows I watch, so this is just further confirmation of that.

    With the baseline established, it was time to check whether I could really see anything above the 1080p mark. Note: it doesn't matter one iota if I can only resolve 1200p or 1440p instead of the full 4K, since TVs of those resolutions aren't commonly for sale. All that matters is whether I can see an improvement in picture from 1080p to 4K.

    I fired up the test once more. This time, like the last, there was no real difference for the first minute or so - except, was that snow just then? When we reach the city I start to notice the flashing off her bra and shoulderpad is seriously reduced and more detail is apparent instead. Background scenery is almost indistinguishable from 1080p, but it does look like there's more "grain" or "grit" to the picture. When the dragon flies over the tower, the vast swathes of screen flicker from 1080p are now gone or almost gone. This flicker comes from the Fresnel and specular components on noisy surfaces. Bra and shoulder look great, detailed, and with no flicker.

    The tree is an area of very high noise and light variation. It does still flicker, but the area of flickering is smaller and the intensity is reduced.

    The same goes for the inside of the cave, which is now beautifully detailed, with fine filigree visible in the specular reflections. The same is true of the dragon. This scene will be the easiest place for anyone to see the difference that 4K brings.

    The end result is that I can see the difference between 720p, 1080p and 4k. After 1080p it becomes a lot more subtle and you have to know what you're looking for to see it. It's mostly found in high frequency specular areas, and particularly ones with a high gloss and albedo difference.

    I can watch a high-bitrate 720p and be happy with it, and I do for material where I don't mind a little loss of detail. I watch most of my stuff in 1080p; it's good enough for almost anything I want to watch, and in many scenes it would be difficult to even see the difference from 1080p to 4K. But there's a small amount of content where I really want to see every little detail possible and have a truly stunning picture, and that's what I get in 4K. I can see the individual strands in the weave of a dark suit. Every little flyaway hair on the actors' heads sticks out clearly. Pores are very clear to see, along with any blemishes on the skin. Metals show tiny detail like pit marks. Every little detail is there to see.

    Is 4K over the top and beyond our physical ability to see? You mentioned the limit of the fovea, and while that's real, I don't think it's the whole picture. Being able to resolve an individual pixel isn't the limit; how those pixels interact with the ones around them matters as well. Vision is also not a frame-based or pixel-based thing - it's continuous - so I think we gain a little extra "resolution" from taking many snapshots of each frame from microscopically different viewpoints. That is, we might register a pixel shown at 30 FPS ten times over in our brain's wetware, each time ever so slightly differently, and use that to build the internal picture of the view.

    Anyway, bottom line is I can see the differences in the picture. I believe most could, if they knew what they were looking for, and what was actually missing from the picture they are viewing. 1080p is almost certainly good enough for the majority, but I love watching shows, and it's totally worth it to me to pay a bit more to get the picture quality of my choice.