
posted by CoolHand on Saturday September 24 2016, @02:02AM
from the hehe-they-said-pissing-hehe dept.

Get ready to endlessly debate the value of "native 4K" on consoles

http://arstechnica.com/gaming/2016/09/microsoft-and-sonys-emerging-4k-pixel-pissing-contest/

Sony's PlayStation 4 Pro (launching in November) and Microsoft's Xbox One Scorpio (launching late next year) are giving the pixel-counters out there a new, 4K-sized battlefield to fight over. Now, Microsoft is drawing a line in the sand in that developing battle, with Microsoft Studios Publishing General Manager Shannon Loftis telling USA Today that "any games we're making that we're launching in the Scorpio time frame, we're making sure they can natively render at 4K."

The word "natively" is important there, because there has been a lot of wiggle room when it comes to talking about what constitutes a truly "4K" game these days. For instance, according to developers Ars has talked to, many if not most games designed for the PS4 Pro will be rendered with an internal framebuffer that's larger than that for a 1080p game, but significantly smaller than the full 3840×2160 pixels on a 4K screen (the exact resolution for any PS4 Pro game will depend largely on how the developer prioritizes the frame rate and the level of detail in the scene). While the PS4 Pro can and does output a full 4K signal, it seems that only games with exceedingly simple graphics will be able to render at that resolution natively.
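One way to picture that wiggle room is as a dynamic-resolution loop: the game renders into an internal framebuffer somewhere between 1080p and native 4K depending on how much GPU headroom the last frame left, and the console's scaler stretches the result to the full 3840×2160 output signal. The linear heuristic and the numbers in this sketch are illustrative assumptions, not Sony's or Microsoft's actual implementation:

    # Minimal sketch of dynamic-resolution scaling as described above.
    # The budget figure and the linear scaling rule are assumptions for
    # illustration only, not any console maker's real algorithm.

    NATIVE_4K = (3840, 2160)
    BASE_1080P = (1920, 1080)

    def internal_render_size(last_frame_ms, budget_ms=33.3):
        """Scale the framebuffer up toward 4K when the GPU has headroom."""
        headroom = max(0.0, min(1.0, (budget_ms - last_frame_ms) / budget_ms))
        scale = 1.0 + headroom  # 1.0x (1080p) up to 2.0x (native 4K) per axis
        return (min(NATIVE_4K[0], int(BASE_1080P[0] * scale)),
                min(NATIVE_4K[1], int(BASE_1080P[1] * scale)))

    # A frame that nearly fills its budget leaves little headroom, so the
    # game renders below 4K and the scaler stretches it to 3840x2160:
    print(internal_render_size(28.0))  # (2225, 1251) -- between 1080p and 4K
    print(internal_render_size(5.0))   # (3551, 1997) -- approaching native 4K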

-- submitted from IRC


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 2) by ledow (5567) on Saturday September 24 2016, @04:46PM (#405972)

    I was absolutely IN AWE of the resolution of my 1024x768 monitor when I put a WinTV card in my PC and watched bog-standard, pre-HD analog video (which would have been PAL in my area) on it.

    It was crisp, sharp, amazingly clear, even when "scaled".

    My eyesight hasn't improved in the 20 years since then, so I see no need for even HD.

    But it was ALWAYS a different matter for TV. No matter how many gadgets I used to put the signal into the TV, no amount of VGA or component-video converters down to standard TV resolutions worked nicely until we started getting HD TVs.

    The reason I have an HD laptop is for the word processing, the web browsing and the text. The TV is now capable of taking that signal and reproducing it (because it's nothing more than a large LCD monitor nowadays).

    Beyond that? You're just pissing away processing power. I buy all my online content in SD and never notice. For moving scenes, I honestly can't spot a difference. And nor can anybody who comes to my house and tries to do so fairly (my TV is pretty nondescript, but if you Google the model number or stick your face so close that you can see the RGB elements, that's just cheating).

    At any sensible distance, on any sensible screen size (32" for my TV, 17" for my laptop), SD is more than good enough for most people. HD collects all the outliers who can see things that others can't, or who want to convince themselves they can.
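    Whether extra pixels are even visible comes down to simple arithmetic: pixels per degree of visual angle, against the roughly 60 pixels per degree that 20/20 vision can resolve. A quick back-of-the-envelope check (the 8-foot seating distance is an assumed figure, not a measurement from anyone's living room):

        # Acuity check: 20/20 vision resolves roughly 60 pixels per degree
        # of visual angle. The 32" screen and 8-foot seat are assumptions.
        import math

        def pixels_per_degree(h_pixels, diagonal_in, distance_in, aspect=(16, 9)):
            """Horizontal pixels per degree of visual angle for a screen/seat."""
            width_in = diagonal_in * aspect[0] / math.hypot(*aspect)
            fov_deg = math.degrees(2 * math.atan(width_in / (2 * distance_in)))
            return h_pixels / fov_deg

        # 32" 16:9 screen viewed from 8 feet (96 inches):
        for name, px in [("SD (720w)", 720), ("HD (1920w)", 1920), ("4K (3840w)", 3840)]:
            print(name, round(pixels_per_degree(px, 32, 96), 1))
        # SD (720w) 43.6 / HD (1920w) 116.2 / 4K (3840w) 232.3

    By that yardstick, HD already saturates normal acuity on a 32" screen at that distance, and 4K is far past anything the eye can resolve.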

    4K is just a horrendous waste of money. Especially if, when you play it on an HD screen, it isn't optimised enough to get you full FPS. I guarantee you that some games will be "4K only" and you won't be able to dial them down.

    There was a point where analog->digital, VGA->HDMI, SD->HD, 25fps->60fps, etc. made a difference you could see. Those days are long gone.

    I recently projected a VGA image onto a 200" screen. Nobody noticed or cared, and they were doing an art project with a very particular artist who was fussing over making it look exactly right. Nobody asked for higher resolutions, or HDMI, or anything else, even though they were available. If you can't spot it, or don't care, on a 200" screen showing a recorded art performance, you aren't going to care about 4K vs HD on the 32" bedroom screen your kids play games on while sitting a few feet away from it.

    If you can truly spot the difference at a normal viewing distance on ALL hardware (Sony and other high-end kit tends to be WORSE for this; I've compared them to old CRTs and even cheap Samsung junk, and the cheap stuff does Blu-ray, DVD and even upscaling better. Sony really wants you to think your wasted money was worthwhile), then I pity you. Your digital media life is always going to be expensive, and it will suck.

    But I would pity you even more if you never once complained about resolution before each upgrade (analog->digital, SD->HD, etc.) and only discovered the "difference" afterwards, because, actually, you can't tell and are just trying to justify your expensive and unnecessary purchases.

  • (Score: 2) by blackhawk (5275) on Saturday September 24 2016, @06:01PM (#405999)

    I'm sitting a little over 6 feet from a 55 inch TV, so my viewing experience isn't the same as yours, sitting god knows how far from a 32 inch screen. The screen quite probably fills more of my field of view than you're used to.

    I'm working on writing a game now, so I have a few opinions on 4K gaming, the main one being that it isn't going to be worth a damn for another 4+ years. We need to wait a while for high-end graphics cards to catch up to what the monitors and TVs can output; it puts a tremendous strain on the CPU/GPU and even the memory architecture to push all that data out of the HDMI connector. If you can spend $1000 USD on a video card, or get a pair of 980s, then you can play in 4K now, but for most people it's out of reach. I'd rather hit a stable 60+ FPS than worry about 4K for gaming. And before you ask: yes, I can damn well tell when a game is running at 30FPS; anything below about 55FPS is jittery and janky as hell for me. 60FPS is usually fine, but for VR I don't feel quite right unless it's 90FPS.
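    The arithmetic behind that trade-off is worth spelling out. Pixel count is only a rough proxy for GPU cost, but it shows why native 4K and high frame rates pull in opposite directions:

        # Why native 4K and high frame rates fight each other: 4K is exactly
        # four times the pixels of 1080p, while the per-frame time budget
        # shrinks as the frame rate rises. (Pixel count is a rough proxy;
        # real GPU cost does not scale perfectly linearly with resolution.)

        def frame_budget_ms(fps):
            """Milliseconds available to render a single frame."""
            return 1000.0 / fps

        print((3840 * 2160) / (1920 * 1080))  # 4.0 -- 4x the shading work
        print(round(frame_budget_ms(30), 1))  # 33.3 ms
        print(round(frame_budget_ms(60), 1))  # 16.7 ms
        print(round(frame_budget_ms(90), 1))  # 11.1 ms -- the VR target above

    A native-4K title at 60FPS has to do roughly four times the pixel work of 1080p inside the same 16.7 ms window, which is exactly the strain described above.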

    Honestly, I wouldn't have bought a 4K TV at this point. I had a perfectly good 9-year-old Sony Bravia with a great picture at 1080p, but it eventually burnt out a capacitor on the backlight (I believe) and would no longer show anything but a blank screen when starting up.

    Given that I had to replace the TV, the choice came down to saving a few bucks now on a 1080p set, or future-proofing with one I'd enjoy for the next 5+ years. The 4K TV has a wider colour gamut, better motion handling, better black levels and a better picture in so many ways than the cheaper sets, so it wasn't hard to decide to drop the money on a TV I'd still want to use several years from now, when 4K content starts to come out.