Microsoft and Sony's Emerging 4K Pissing Contest
posted by CoolHand on Saturday September 24 2016, @02:02AM   Printer-friendly
from the hehe-they-said-pissing-hehe dept.

Get ready to endlessly debate the value of "native 4K" on consoles

http://arstechnica.com/gaming/2016/09/microsoft-and-sonys-emerging-4k-pixel-pissing-contest/

Sony's PlayStation 4 Pro (launching in November) and Microsoft's Xbox One Scorpio (launching late next year) are giving the pixel-counters out there a new, 4K-sized battlefield to fight over. Now, Microsoft is drawing a line in the sand in that developing battle, with Microsoft Studios Publishing General Manager Shannon Loftis telling USA Today that "any games we're making that we're launching in the Scorpio time frame, we're making sure they can natively render at 4K."

The word "natively" is important there, because there has been a lot of wiggle room when it comes to talking about what constitutes a truly "4K" game these days. For instance, according to developers Ars has talked to, many if not most games designed for the PS4 Pro will be rendered with an internal framebuffer that's larger than that for a 1080p game, but significantly smaller than the full 3840×2160 pixels on a 4K screen (the exact resolution for any PS4 Pro game will depend largely on how the developer prioritizes the frame rate and the level of detail in the scene). While the PS4 Pro can and does output a full 4K signal, it seems that only games with exceedingly simple graphics will be able to render at that resolution natively.

-- submitted from IRC


Original Submission

Related Stories

Is Screen Resolution Good Enough Considering the Fovea Centralis of the Eye? 66 comments

The top Google hits say that there is little or no benefit to resolution above 4K. I recently bought a 40" 4K TV which I use as a monitor (2' viewing distance). While this is right at the threshold where I'm told no benefit can be gained from additional resolution, I can still easily discern individual pixels. I'm still able to see individual pixels until I get to about a 4' viewing distance (but I am nearsighted).

I did some research and according to Wikipedia the Fovea Centralis (center of the eye) has a resolution of 31.5 arc seconds. At this resolution, a 4k monitor would need to be only 16" at a 2' viewing distance, or my 40" would need a 5' viewing distance.
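
For anyone who wants to check that arithmetic, a minimal sketch (assuming a 16:9 panel and treating 31.5 arc seconds as the per-pixel limit) reproduces the submitter's 16-inch and 5-foot figures:

    import math

    ACUITY_RAD = math.radians(31.5 / 3600)   # 31.5 arc seconds per pixel
    H_PIXELS   = 3840                        # horizontal pixels on a 4K panel
    ASPECT     = 16 / 9

    def max_diagonal_inches(distance_in):
        """Largest 16:9 4K diagonal whose pixels stay below the acuity limit."""
        pixel_pitch = distance_in * math.tan(ACUITY_RAD)   # inches per pixel
        width = pixel_pitch * H_PIXELS
        return width * math.hypot(1, 1 / ASPECT)           # width -> diagonal

    def min_distance_inches(diagonal_in):
        """Closest distance at which a given 16:9 4K diagonal stays below the limit."""
        width = diagonal_in / math.hypot(1, 1 / ASPECT)
        return (width / H_PIXELS) / math.tan(ACUITY_RAD)

    print(f"4K diagonal at a 2' viewing distance: {max_diagonal_inches(24):.1f} in")      # ~16 in
    print(f"Viewing distance for a 40\" 4K panel: {min_distance_inches(40) / 12:.1f} ft")  # ~5 ft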

Now the Fovea Centralis comprises only the size of 2 thumbnails width at arms length (2° viewing angle) and the eye's resolution drops off quickly farther from the center. But this tiny portion of the eye is processed by 50% of the visual cortex of the brain.

So I ask, are there any soylentils with perfect vision and/or a super high resolution set up, and does this match where you can no longer discern individual pixels? Do you think retina resolution needs to match the Fovea Centralis or is a lesser value acceptable?

My 40" 4k at 2' fills my entire field of view. I really like it because I have so much screen real estate for multiple windows or large spreadsheets, or I can scoot back a little bit for gaming (so I don't have to turn my head to see everything) and enjoy the higher resolution. I find 4k on high graphics looks much nicer than 1080p on Ultra. I find the upgrade is well worth the $600 I spent for the tv and a graphics card that can run it. Have you upgraded to 4k and do you think it was worth it? I would one day like to have dual 32" 8k monitors (not 3D). What is your dream setup if technology and price weren't an issue?

Written from my work 1366 x 768 monitor.

Related discussions: First "8K" Video Appears on YouTube
LG to Demo an 8K Resolution TV at the Consumer Electronics Show
What is your Video / Monitor Setup?
Microsoft and Sony's Emerging 4K Pissing Contest


Original Submission

  • (Score: 0) by Anonymous Coward on Saturday September 24 2016, @02:14AM

    by Anonymous Coward on Saturday September 24 2016, @02:14AM (#405819)

    By small number.

    • (Score: 4, Insightful) by edIII on Saturday September 24 2016, @03:35AM

      by edIII (791) on Saturday September 24 2016, @03:35AM (#405837)

      The number I want is zero.

      0 instances of DRM anywhere
      0 binaries or blobs in their devices
      0 issues with installing other operating systems
      0 advertising networks
      0 issues with making backup copies of anything with it
      0 instances of telemetry

      and finally,

      0 Sony executives escaping justice for infecting us with a root kit and surreptitiously violating our privacy and peaceful enjoyment of property.

      --
      Technically, lunchtime is at any moment. It's just a wave function.
  • (Score: 0) by Anonymous Coward on Saturday September 24 2016, @02:16AM

    by Anonymous Coward on Saturday September 24 2016, @02:16AM (#405822)

     

    • (Score: 3, Insightful) by Scruffy Beard 2 on Saturday September 24 2016, @02:24AM

      by Scruffy Beard 2 (6030) on Saturday September 24 2016, @02:24AM (#405823)

      I remember back in the day, one of the supposed advantages of 3D rendering was resolution independence.

      Sure textures get blurry when you get close, but that is what interpolation is for.

      • (Score: 0) by Anonymous Coward on Saturday September 24 2016, @02:34AM

        by Anonymous Coward on Saturday September 24 2016, @02:34AM (#405826)

        I remember when 3D meant 3D rendering instead of 3D printing or stereoscopic 3D.

        • (Score: 0) by Anonymous Coward on Saturday September 24 2016, @11:30AM

          by Anonymous Coward on Saturday September 24 2016, @11:30AM (#405906)

          No love for POV-ray [povray.org]?

      • (Score: 1) by Francis on Saturday September 24 2016, @03:19AM

        by Francis (5544) on Saturday September 24 2016, @03:19AM (#405831)

        Textures got blurry mainly because they had a limited amount of space for textures. The other issue was the limited number of vertices, which made it impossible to create proper curves. I haven't noticed that much with recent games; I think by sometime in the noughties it wasn't much of an issue.

        But, then again, that was about the time when monitors temporarily stagnated in terms of resolution, which gave the games and hardware developers a chance to catch up a bit and anti-aliasing has gotten a lot better. Personally, the anti-aliasing is a bigger deal for me than the lack of 4k resolution. Even a few years ago when FO:NV was new, the resolution was more than enough, but the aliasing that would go on was somewhat distracting.

        • (Score: 2) by mhajicek on Saturday September 24 2016, @07:14AM

          by mhajicek (51) on Saturday September 24 2016, @07:14AM (#405873)

          Hmm. CADCAM software uses parametric surfaces and solids rather than triangle meshes, which allows for as much precision as you'd care to calculate. Will games eventually go down that path too?

          --
          The spacelike surfaces of time foliations can have a cusp at the surface of discontinuity. - P. Hajicek
      • (Score: 2) by ledow on Saturday September 24 2016, @04:35PM

        by ledow (5567) on Saturday September 24 2016, @04:35PM (#405970) Homepage

        Isn't that what texture mipmapping solved?

        A range of textures depending on the proximity of the viewer to the texture?

        This stuff isn't limited by anything other than the processing power, and you can render any resolution you like. But without the back-end video memory to hold all the textures (including the mipmapped ones), and the processing to draw at that resolution (which quadruples every time you double the resolution, for example), you are constantly playing catchup.
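
        As a rough illustration of those two costs - video memory for a full mip chain, and shading work as the output resolution rises - here's a small sketch; the 2048x2048 RGBA8 texture is an assumption for the example, not anything from the thread:

            # Extra video memory a full mipmap chain adds on top of a base texture,
            # plus the per-frame pixel counts that drive the fill/shading cost.
            def mipmap_bytes(size, bytes_per_texel=4):
                """Total bytes for a square RGBA8 texture plus its mip chain down to 1x1."""
                total = 0
                while size >= 1:
                    total += size * size * bytes_per_texel
                    size //= 2
                return total

            base = 2048 * 2048 * 4
            chain = mipmap_bytes(2048)
            print(f"2048x2048 RGBA8 alone:  {base / 2**20:.1f} MiB")
            print(f"...with full mip chain: {chain / 2**20:.1f} MiB ({chain / base:.2f}x, roughly a third extra)")

            for w, h in [(1920, 1080), (3840, 2160)]:
                print(f"{w}x{h}: {w * h:>10,d} pixels shaded per frame")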

        How about an option to let people run it at the resolution they want, and get the relevant FPS on their hardware?

        Then I'd do some metrics and collect HDMI device details, etc. - and I bet that a lot of those that select 4K don't have the hardware but "think" it makes a difference, and those that have 4K turn it down to get better FPS.

        • (Score: 2) by Pino P on Saturday September 24 2016, @10:15PM

          by Pino P (4721) on Saturday September 24 2016, @10:15PM (#406055) Journal

          Mipmapping isn't enough when the camera comes so close to a surface that even the largest texture in a set gets stretched. For example, every console since the Nintendo 64 has supported mipmapping in some form, but Goldeneye 007 for N64 still gets blurry as fcuk when facing a wall because the largest texture in a set couldn't be bigger than 64x64. That's more an issue of memory size, memory bandwidth (both within RAM and from permanent storage), and artist effort.

    • (Score: 4, Insightful) by physicsmajor on Saturday September 24 2016, @03:34AM

      by physicsmajor (1471) on Saturday September 24 2016, @03:34AM (#405836)

      So, you're joking, but you know what? There is a limit. And while it's above 360p, we've passed it.

      If you sit a normal distance from your television (i.e., at least 6 feet), and your screen is smaller than 50 inches or so, you cannot perceive anything more than 720p on that monitor. It's entirely wasted. If you stick your nose in it, you can, but not in a living room. Doesn't matter how good your vision is, that's the limit of resolution in the fovea.

      Computer monitors at a desk can perhaps make use of 2500x1400 or so. Once we start talking 4K, it's just stupid stats to waste power and tank your framerates to try and drive all those pixels you can't see anyhow. The one and only use of actual, displayed 4K resolution I could see is VR. It sure as hell isn't useful in your living room. That said, essentially the best antialiasing method in existence is rendering at a higher resolution and downsampling. There is truth to that, but it doesn't seem to be what this pissing contest is about.
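
      That render-high-then-downsample idea is easy to demonstrate. A minimal sketch follows (a synthetic hard edge stands in for a rendered frame; 2x ordered supersampling with a simple box filter) - the downsampled edge picks up intermediate grey levels, which is exactly the smoothing this kind of antialiasing buys:

          import numpy as np

          def render(width, height):
              """Pretend renderer: white above a diagonal edge, black below (normalised coords)."""
              y, x = np.mgrid[0:height, 0:width]
              return (x / width * 0.37 + 0.2 > y / height).astype(np.float32)

          def downsample_2x(img):
              """Average each 2x2 block into one output pixel (box-filter downsample)."""
              h, w = img.shape
              return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

          native = render(64, 64)                         # rendered straight at target resolution
          supersampled = downsample_2x(render(128, 128))  # rendered at 2x, then downsampled

          # The native render is strictly black/white (hard, aliased edge); the
          # supersampled one contains in-between greys along the edge.
          print("distinct levels, native:      ", len(np.unique(native)))
          print("distinct levels, supersampled:", len(np.unique(supersampled)))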

      Don't even get me started on phone screen resolutions. If you are above 720p there, you are wasting your battery on a useless statistic for any smartphone use case.

      • (Score: 2) by blackhawk on Saturday September 24 2016, @06:45AM

        by blackhawk (5275) on Saturday September 24 2016, @06:45AM (#405869)

        I think you might be underestimating the figures a little. I sit about 7 feet away from a 55 inch TV which is a mid-range Sony 4k set.

        Watching content at various resolution for me is more like:

        * 480p - it's an awful blurry mess that my TV's smart scaling works its hardest to make watchable
        * 576p - blurry and lacking definition
        * 720p - the absolute minimum resolution I can stand to watch; at this point it's still somewhat blurry and is lacking a good deal of the detail I crave
        * 1080p - the picture now looks pretty decent, and on high bitrate, high dynamic range sources looks really quite good - I enjoy watching at this level of detail
        * 4K - I can see extra definition in the picture over 1080p, colours pop (HDR), grain is reduced, detail level is excellent

        At 4k you might be right, it's more than I need, but what I need and can perceive is above 1080p and far, far beyond the crappy 720p you think I can see. I can easily see the difference between 4k and 1080p content.

        The changes become more subtle above 1080p but are present. I probably don't need the entire 4k, but I'd rather have some unused bandwidth than sit there looking at a picture I know could be much better.

        • (Score: 2) by physicsmajor on Saturday September 24 2016, @01:39PM

          by physicsmajor (1471) on Saturday September 24 2016, @01:39PM (#405932)

          I strongly suspect that the reason for this is that TV manufacturers have no incentive to put good scaling chips in their monitors. They want you to believe that the best is the highest resolution, so they deliberately cheap out on (or sabotage in software) how lower resolution signals are upscaled and displayed.

          Try this sometime:

          - Connect a computer to one of your auxiliary inputs.
          - Download MPC-HC.
          - Play back a lower resolution recording (from a free Blender movie or otherwise) with the computer doing the upscaling, sending a native 4K signal to the TV.

          You're on the physics fence between being able to tell the difference between 720p and 1080p with that system. However, I think with that test you'll find no difference whatsoever between 1080p and 4K. I sit 3-4 feet from 22 inch monitors and there is no perceptible difference between 720p and 1080p.

          • (Score: 2) by blackhawk on Saturday September 24 2016, @02:42PM

            by blackhawk (5275) on Saturday September 24 2016, @02:42PM (#405940)

            I have a mid range Sony Bravia set which comes with a pretty impressive scaler. It's not one of the cheap brand things you get from Best Buy. I'll look into running that test, since I have my PC hooked up via a nice long HDMI cable, it should be easy enough to do.

            For me, what I'm mostly noticing from 720p upwards is the lack of grain / noise, or a kind of high-passed look to the output. It's softer, less gritty.

            BTW, the upstairs TV is a cheaper model (50 inch) that I believe only has 720p physical resolution but takes in 1080p signals. Looking at that screen from a foot or two away I can clearly see the individual pixels, and even at about 10 feet it is clearly a much worse picture when displaying the Kodi interface and during playback of media. The fonts on the Kodi interface appear smeared and blurry.

            My eyesight has been tested recently and it's close to perfect. Sitting 2 feet back from my 24 inch 1080p monitor I can make out the single pixel dot above the 'i' character quite clearly.

          • (Score: 2) by Scruffy Beard 2 on Saturday September 24 2016, @03:41PM

            by Scruffy Beard 2 (6030) on Saturday September 24 2016, @03:41PM (#405958)

            I suspect that 4k gives you higher dynamic range, not extra resolution.

            It is essentially the equivalent of dithering.

            Though 1080p on a 42" TV is definitely blurry at 1-2 feet (the viewing distance my brother uses due to vision problems).

            • (Score: 2) by blackhawk on Saturday September 24 2016, @05:48PM

              by blackhawk (5275) on Saturday September 24 2016, @05:48PM (#405992)

              The 4k Sony Bravia I am watching on does have high dynamic range and I can't recommend that enough. Anyone should be able to pick out the better saturation of colours and the wider range from black to white - particularly in dark scenes. I've watched a few series now with 1080p 10bit h265 encodes and they look glorious compared to previous versions. This is a combination of both the improved colour gamut and improvements in bitrate from the H265 algorithm (assuming the encoder selected a decent rate and didn't just try and crush a 45min show down to 230MB files - which does work pretty well oddly enough).

              Viewing distance, screen size and resolution all have to be taken into account at once when talking about this subject though. None of those figures alone, or in conjunction with one other, gives the true picture - what matters is pixels per degree, i.e. the number of pixels displayed per degree of viewing angle from the subject's eye.
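
              A minimal sketch of that pixels-per-degree calculation (assuming 16:9 panels; the setups below just mirror the distances mentioned in this thread, they aren't measurements):

                  import math

                  def pixels_per_degree(diagonal_in, h_pixels, distance_in, aspect=16 / 9):
                      """Horizontal pixels per degree of viewing angle for a flat 16:9 panel."""
                      width_in = diagonal_in / math.hypot(1, 1 / aspect)  # diagonal -> width
                      pixel_pitch = width_in / h_pixels                   # inches per pixel
                      return 1 / math.degrees(math.atan(pixel_pitch / distance_in))

                  for name, diag, h_px, dist in [
                      ("55\" 4K at ~7 ft",    55, 3840, 84),
                      ("55\" 1080p at ~7 ft", 55, 1920, 84),
                      ("55\" 720p at ~7 ft",  55, 1280, 84),
                  ]:
                      print(f"{name:20s} {pixels_per_degree(diag, h_px, dist):5.1f} px/degree")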

              I actually sit a lot closer to my TV than most people, and a 55 inch screen is decently large, so I have no problem spotting the difference between 720p and 1080. Above 1080p it's harder, a lot harder and it comes more as a qualitative thing. It's not individual pixels you are seeing, but how they react as a whole.

              Textures become more defined, in particular hair and fur. Fresnel lighting which tends to flicker outrageously on lower resolutions will simmer down or be removed entirely. You are getting the benefits you'd normally expect from anti-aliasing, but without any of the downsides because the extra resolution is actually there. I can see pore detail on skin better, and individual flyaway hairs on people's hair stand out well at 1080p and above. They are present at 720p but less distinct.

              You will notice the extra detail when the camera pans, as flicker that would be present in 1080p and lower content is vastly reduced at 4k or not present at all.

              I work in game development, so looking for that extra detail is pretty much in my job description. I know other people who can't see the difference, but I can, so it's totally worth the extra cost for me.

          • (Score: 2) by blackhawk on Sunday September 25 2016, @06:03AM

            by blackhawk (5275) on Sunday September 25 2016, @06:03AM (#406180)

            So I ran the test requested by physicsmajor and can offer you my empirical results.

            Setup:

            * The TV set is a Sony Bravia 55 inch
            * I'm a programmer in my late forties who works with games and media
            * My head is about 6.5 - 7.0 feet from the TV set
            * Media tested was Sintel from the Blender Project H264 versions at 720p, 1080p and 4k
            * I used an HDMI 4k / 3D capable cable from my PC to the TV
            * all tests ran native at 4K on the PC
            * VLC decoder was used in fullscreen mode (MPC was being a twat about resizing the playback image to the window size)

            The test media is provided at quite a decent bitrate - higher than is normal for downloaded media, so many of the image quality issues I dislike at 720p are no longer present. While a 720p scene release tends to be a bit "smeary" and of course a little flat in the colour range, this wasn't an issue with the Sintel media.

            At 720p the picture looked fine. It was quite watchable and unless you were looking for it you might not realise you weren't seeing a 1080p picture. But this came at the cost of high frequency detail, which was less present in the image. The opening minute was almost the same no matter the resolution because it's a snow / black rock scene with almost no high frequency detail. Oddly, when I watched it in 4k I thought I saw it was snowing for the first time. I'll have to go back and check the others now to see if it really was snowing.

            The details I was looking for in particular are:

            * the cloth in the main character's bra-top
            * specular highlights off the walls in the city, the large tree, the walls of the cave, the end dragon, etc
            * increased detail in areas where specular highlights typically occur

            In the 720p presentation, a lot of the areas that show as "glinting" or "flashing" in 1080p are much softer and appear more as a grey than a harsh flashing between deep greys and bright whites. The areas most affected are those with high frequency detail that is specular in nature. Overall, the film was watchable and you might not feel like you are missing any detail, especially if you didn't know it was there to begin with.

            Once I watched the 1080p presentation I saw those areas mentioned appear as textured zones and they started to noticeably flash as the fresnel changed (angle at which you are viewing them combined with a sharp incidence on reflected light). I noticed some areas now appear more detailed. Some parts of the leather armour from the fight scene have just slightly better looking texture, but it's very subtle. Her shoulderpad now shows glinting as it shifts in the light and is quite noisy, as is her top.

            I went back and viewed both the 720p and 1080p again to confirm I had correctly identified differences in the picture. Having done so I can confirm I can definitely see the differences, and it wasn't that hard to tell which was which. I can usually tell when I am watching 720p rather than 1080p on the shows I watch, so this is just further confirmation of that.

            With the baseline established it was time to check if I could really see anything above the 1080p mark. Note: it doesn't matter one iota if I can only see 1200p or 1440p instead of the full 4k, since TVs of those resolutions aren't commonly for sale. All that matters is whether I can see an improvement in picture from 1080p to 4k.

            I fired up the test once more. This time, like the last - no real difference for the first minute or so - except, was that snow just then? When we reach the city I start to notice the flashing off her bra and shoulderpad are seriously reduced and instead more detail is apparent. Background scenery is almost indistinguishable from 1080p but it does look like there's more "grain" or "grit" to the picture. When the dragon flies over the tower the vast swathes of screen flicker from 1080p are now gone or almost gone. This is from the fresnel and specular components on noisy surfaces. Bra and shoulder look great, detailed, and no flicker.

            The tree is an area of very high noise and light variation. It does still flicker, but the area of flickering is smaller and the intensity is reduced.

            The same goes for the inside of the cave, which is now beautifully detailed with fine filigree visible from specular reflections. Again, same thing for the dragon. This scene in here will be the easiest place for anyone to see the difference that 4k brings.

            The end result is that I can see the difference between 720p, 1080p and 4k. After 1080p it becomes a lot more subtle and you have to know what you're looking for to see it. It's mostly found in high frequency specular areas, and particularly ones with a high gloss and albedo difference.

            I can watch a high bitrate 720p and be happy with it, and I do for material where I don't mind a little loss of detail. I watch most of my stuff in 1080p; it's good enough for almost anything I want to watch. In many scenes it would be difficult to even see the difference from 1080p to 4k. There's a small amount of content where I really want to see every little detail possible and have a truly stunning picture, and that's what I get in 4k. I can see the individual strands in the weave of a dark suit. Every little flyaway hair on the actors' heads is sticking out clearly. Pores are very clear to see, along with any blemishes on the skin. Metals show tiny detail like pit marks. Every little detail is there to see.

            Is 4k over the top and beyond our physical ability to see? You mentioned the limit on the fovea, and while this is true I don't think it's the whole picture. Being able to see an individual pixel isn't the limit, it's how those pixels interact with the ones around them that matters as well. Vision is also not a frame based / pixel based thing, it's continuous, so I think we gain a little more "resolution" from being able to take x number of snapshots of each frame from microscopically different views i.e. we might actually register a pixel shown at 30FPS 10 times in our brain wetware, each ever so slightly differently, and that's used to build the internal picture of the view.

            Anyway, bottom line is I can see the differences in the picture. I believe most could, if they knew what they were looking for, and what was actually missing from the picture they are viewing. 1080p is almost certainly good enough for the majority, but I love watching shows, and it's totally worth it to me to pay a bit more to get the picture quality of my choice.

        • (Score: 2) by ledow on Saturday September 24 2016, @04:46PM

          by ledow (5567) on Saturday September 24 2016, @04:46PM (#405972) Homepage

          I was absolutely IN AWE of the resolution of my 1024x768 monitor when I put a WinTV card in it and watched bog-standard, pre-HD analog video (which would have been PAL in my area) on it.

          It was crisp, sharp, amazingly clear, even when "scaled".

          My eyesight hasn't improved in the 20 years since then, so I see no need for even HD.

          But it was ALWAYS a different matter for TV. No matter how many gadgets I used to put the signal into the TV, no amount of VGA or component video converters to standard TV resolutions would work nicely until we started getting HD TVs.

          The reason I have a HD laptop is for the word processing, the web-browsing and the text. The TV is now capable of taking that signal and reproducing it (because it's nothing more than a large LCD monitor nowadays).

          Beyond that? You're just pissing away processing power. I buy all my online content in SD and never notice. For moving scenes, I honestly can't spot a difference. And nor can anybody who comes to my house and tries to do so fairly (my TV is pretty nondescript, but if you Google the model number or stick your face so close you can see RGB elements, that's just cheating).

          At any sensible distance, for any sensible screen size (32" for my TV, 17" for my laptop), SD is more than good enough for most people. HD collects all the outliers who can see things that others can't or want to convince themselves.

          4K is just a horrendous waste of money. Especially if - when you play it on an HD screen - it isn't optimised enough to get you full FPS. I guarantee you that some games will be "4K only" and you won't be able to dial them down.

          There was a point where analog->digital, VGA->HDMI, SD->HD, 25fps->60fps, etc. made a difference that you could see. Those days are long-gone.

          I projected a VGA image to a 200" projected screen recently. Nobody even noticed or cared. And they were doing an art project with a very particular artist who was fussing about making it look exactly right. Nobody asked for higher resolutions, or HDMI or anything, even though they were available. If you can't spot it or care on a 200" screen for a recorded art performance, you aren't going to care about 4K vs HD on your 32" bedroom screen for the kids to play games on, when they're sitting feet away from it.

          If you can truly spot the difference at a normal viewing distance on ALL hardware (Sony and big-end things tend to be WORSE for this, I've compared them to old CRTs and even cheap Samsung junk and the cheap stuff does Blu-Ray, DVD and even upscaling better - Sony really want you to think your wasted money was worthwhile), then I pity you. Your digital media life is always going to be expensive and suck for you.

          But I would pity you even more if I thought that - 20 years ago - you never once complained about the resolution still being bad even after you went from SD->HD or analog->digital, etc. because, actually, you can't tell and are just trying to justify your expensive and unnecessary purchases.

          • (Score: 2) by blackhawk on Saturday September 24 2016, @06:01PM

            by blackhawk (5275) on Saturday September 24 2016, @06:01PM (#405999)

            I'm sitting a little over 6 feet from a 55 inch TV, so my viewing experience isn't the same as yours sitting god knows how far from a 32 inch screen. The screen is quite probably larger in my field of view than you are used to.

            I'm working on writing a game now, so I have a few opinions on 4k gaming, and my main one is that it isn't going to be worth a damn for 4+ years. We need to wait a while for the high end graphics cards to catch up to what the monitors and TVs can output. It puts a tremendous strain on the CPU / GPU and even memory architecture to push all that data out the HDMI connector. If you can spend $1000USD on a video card or get a pair of 980s then you can play in 4k now, but for most it is out of reach. I'd rather hit 60+ FPS at a stable rate than worry about 4K for gaming. And before you ask, yes I can damn well see when a game is running above 30FPS; it's jittery and janky as hell at anything below about 55FPS for me. 60FPS is usually fine, but for VR I don't feel quite right unless it's 90FPS.

            I honestly wouldn't have bought a 4K TV at this time. I had a perfectly good Sony Bravia which was 9 years old and had a great picture at 1080p. It eventually burnt out the capacitor on the backlight (I believe) and would no longer show anything more than a blank screen when starting up.

            Given I had to replace the TV, the choice came down to saving a few bucks now and getting a 1080p, or future-proofing and getting a TV I will enjoy for the next 5+ years. The 4K TV has greater colour gamut, improved motion handling, improved black levels and a better picture in so many ways over the cheaper sets - it wasn't hard to decide to drop the money and get a TV I would still want to use several years from now when 4k content starts to come out.

      • (Score: 1, Insightful) by Anonymous Coward on Saturday September 24 2016, @05:06PM

        by Anonymous Coward on Saturday September 24 2016, @05:06PM (#405977)

        4k is for programmers. [tsotech.com] I use a 4k, 39" TV as a computer monitor at 2-3 ft distance. With the higher resolution, new usage patterns become possible.

  • (Score: 0) by Anonymous Coward on Saturday September 24 2016, @03:21AM

    by Anonymous Coward on Saturday September 24 2016, @03:21AM (#405833)

    I know you soylent fuckers don't hold yourselves to the highest of standards, but c'mon? Have we really delved so deep into the shit-abyss that this is the kind of fetid drivel you use for a headline?

    • (Score: 4, Funny) by edIII on Saturday September 24 2016, @03:31AM

      by edIII (791) on Saturday September 24 2016, @03:31AM (#405834)

      Slow down there, buddy.

      A pissing contest, or pissing match, is a game in which participants compete to see who can urinate the highest, the farthest, or the most accurately. Although the practice is often associated with adolescent boys, women have been known to play the game, and there are literary depictions of adults competing in it.

      I've been around some time, and pissing contest was never an intrinsically negative thing. It was a sport, and I won plenty of times by just pissing off a bridge or overpass, which still counts for distance (Yes, it does Chuck).

      If anything I thought the title was "playful". I guess, because it's a game. I'm guessing it's turned negative huh?

      Well in any case, the guys here didn't choose it. Arstechnica did [arstechnica.com] :)

      --
      Technically, lunchtime is at any moment. It's just a wave function.
      • (Score: 0) by Anonymous Coward on Saturday September 24 2016, @04:15AM

        by Anonymous Coward on Saturday September 24 2016, @04:15AM (#405846)

        Sure there friend, a pissing match might not be a negative thing, but surely it is a vulgar one eh? Is this really tha sorta thing you folks want the kids to see?

        • (Score: 0) by Anonymous Coward on Saturday September 24 2016, @04:23AM

          by Anonymous Coward on Saturday September 24 2016, @04:23AM (#405849)

          > Is this really tha sorta thing you folks want the kids to see?

          Yes.
          If you don't want kids to see it, then get your own kids.

        • (Score: 2) by edIII on Saturday September 24 2016, @04:40AM

          by edIII (791) on Saturday September 24 2016, @04:40AM (#405850)

          I would at most say it is juvenile. Of course it can be vulgar because it's related to urination, but that is generally reserved for something offensive. The idea of some boys playing out in the woods and deciding who can pee the farthest doesn't offend me. I accept that as part of childhood, and natural development. I'm sure that it is nearly universal behavior.

          Getting beyond the technically vulgar nature of it, it's not unreasonable to assume a benign sporting contest. It may even be rivals, but that doesn't automatically imply a negative and caustic relationship. Again, if you see a pissing contest between boys as something rather benign and innocent, it shouldn't surprise you that I interpreted it the same way.

          God forbid, you ever have a friendly rivalry where you can shake hands afterwards huh?

          --
          Technically, lunchtime is at any moment. It's just a wave function.
    • (Score: 1, Touché) by Anonymous Coward on Saturday September 24 2016, @05:30AM

      by Anonymous Coward on Saturday September 24 2016, @05:30AM (#405858)
      Please. The phrase has become so widely used since at least the 1940s it's practically become an idiom for a pointless, childish, ego-driven competition, which is exactly what this squabble between Microsoft and Sony seems to be.
  • (Score: 0) by Anonymous Coward on Saturday September 24 2016, @12:47PM

    by Anonymous Coward on Saturday September 24 2016, @12:47PM (#405919)

    It fascinated me for minutes. Can't wait to see it in glorious 3.84K resolution.