
posted on Wednesday January 11 2017, @08:08AM
from the surgically-grafted-to-the-inside-of-the-eyelids dept.

The top Google hits say that there is little or no benefit to resolution above 4K. I recently bought a 40" 4K TV which I use as a monitor (2' viewing distance). While this is right at the threshold where I'm told no benefit can be gained from additional resolution, I can still easily discern individual pixels. I'm still able to see individual pixels until I get to about a 4' viewing distance (but I am nearsighted).

I did some research, and according to Wikipedia the fovea centralis (the center of the retina) has an angular resolution of 31.5 arcseconds. At that resolution, a 4K monitor would need to be only 16" at a 2' viewing distance, or my 40" would need a 5' viewing distance.
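
For anyone who wants to check that arithmetic, here's a minimal Python sketch. The 31.5-arcsecond acuity and the panel sizes come from above; the flat 16:9 panel, square pixels, and small-angle approximation are simplifying assumptions of mine:

    import math

    ARCSEC_PER_RAD = 180 * 3600 / math.pi  # ~206,265 arcseconds per radian

    def pixel_pitch_in(diagonal_in, h_px=3840, v_px=2160):
        # Size of one square pixel on a flat panel, in inches.
        # diagonal / diagonal-in-pixels == width / horizontal pixels.
        return diagonal_in / math.hypot(h_px, v_px)

    def blend_distance_in(diagonal_in, acuity_arcsec=31.5):
        # Distance at which one pixel subtends the given acuity
        # (small-angle approximation: angle ~ pitch / distance).
        return pixel_pitch_in(diagonal_in) * ARCSEC_PER_RAD / acuity_arcsec

    def max_diagonal_in(distance_in, acuity_arcsec=31.5):
        # Largest 4K diagonal whose pixels stay below the acuity limit
        # at the given viewing distance.
        pitch = acuity_arcsec / ARCSEC_PER_RAD * distance_in
        return pitch * math.hypot(3840, 2160)

    print(blend_distance_in(40))   # ~59 inches, i.e. just under 5 feet
    print(max_diagonal_in(24))     # ~16 inches at a 2-foot viewing distance

Both outputs line up with the 16" and 5' figures above, so the numbers are at least internally consistent.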

Now, the fovea centralis covers only about the width of two thumbnails at arm's length (a 2° viewing angle), and the eye's resolution drops off quickly away from the center. But this tiny portion of the retina is processed by 50% of the visual cortex of the brain.

So I ask: are there any Soylentils with perfect vision and/or a super-high-resolution setup, and does this match the point where you can no longer discern individual pixels? Do you think "retina" resolution needs to match the fovea centralis, or is a lesser value acceptable?

My 40" 4k at 2' fills my entire field of view. I really like it because I have so much screen real estate for multiple windows or large spreadsheets, or I can scoot back a little bit for gaming (so I don't have to turn my head to see everything) and enjoy the higher resolution. I find 4k on high graphics looks much nicer than 1080p on Ultra. I find the upgrade is well worth the $600 I spent for the tv and a graphics card that can run it. Have you upgraded to 4k and do you think it was worth it? I would one day like to have dual 32" 8k monitors (not 3D). What is your dream setup if technology and price weren't an issue?

Written from my work 1366x768 monitor.

Related discussions: First "8K" Video Appears on YouTube
LG to Demo an 8K Resolution TV at the Consumer Electronics Show
What is your Video / Monitor Setup?
Microsoft and Sony's Emerging 4K Pissing Contest


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 2) by Scruffy Beard 2 (6030) on Wednesday January 11 2017, @08:34AM (#452421)

    I for one am glad that computer monitors no longer appear to be limited to 1920x1080x24bpp at 60 Hz (I blame HDMI and DRM for that).

    Of course, I will be using 1280x1024x24bpp monitors for the foreseeable future. One I can push above 60 Hz. (Two are technically not limited to 24bpp, but I'm pretty sure the video card is.)
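
    As a rough sanity check on the HDMI theory, here's a quick Python sketch of the raw video data rates involved (payload only; real links add blanking and encoding overhead, so actual requirements run roughly 20-25% higher, and the HDMI ceilings in the comments are nominal figures):

        # Raw video payload per second, ignoring blanking/encoding overhead.
        def raw_gbps(width, height, hz, bpp=24):
            return width * height * hz * bpp / 1e9

        for name, w, h in [("1920x1080", 1920, 1080),
                           ("2560x1440", 2560, 1440),
                           ("3840x2160", 3840, 2160)]:
            print(f"{name} @ 60 Hz: ~{raw_gbps(w, h, 60):.1f} Gbit/s")

        # 1080p60 ~3.0, 1440p60 ~5.3, 4K60 ~11.9 Gbit/s raw. Early HDMI
        # carries roughly 5 Gbit/s, HDMI 1.3/1.4 about 10, and HDMI 2.0
        # about 18 -- which is why 4K at 60 Hz over HDMI had to wait for 2.0.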

  • (Score: 0) by Anonymous Coward on Wednesday January 11 2017, @04:24PM (#452545)

    I've never owned a 1920x1080 monitor and am not sure of the limit you speak of.

    I went from 640x480 to 800x600 to 1024x768 to 1280x1024 to 1920x1200, then regressed to 1680x1050, then went to 2560x1440, and finally to whatever "4K" is that isn't really 4K (3840x2160).

    I thought only the cool kids were using TVs for their computers, like I did back when I had a Commodore 64.

    Even Nvidia naming its cards after HD television resolutions struck me as just a gimmick to help sell cheap shit at high prices. True 1920x1200 monitors are not inexpensive nowadays; it's like 16:10 went away with the economy in 2008. Everything got cheap, but that didn't make any of it better.

    All the reviews benchmark stuff at 1080 on "ultra." Well, no shit: that resolution isn't even as good as what we were trying to get beyond ten years ago. I'd hope it runs well, because if it ran worse, why are we paying for this garbage? The new Nvidia cards are impressive, yes, but to market them after something that sucks... I guess that's what happens now that my hobby is mainstream. I thought the "1080" was a low-end card until I was informed of my mistake. I guess no one will make a "1200" or a "1440" because they sound like modem bit rates for geeks.

    • (Score: 2) by vux984 (5045) on Wednesday January 11 2017, @09:29PM (#452713)

      I thought the "1080" was a low end card until I was informed of my mistake.

      Like the $800 price tag didn't give it away? Or the fact that anyone who knew anything about Nvidia knew that the GTX 1080 was just a next-generation 980, which was the successor to the 780, which was itself several generations on from the 280...

      If the GTX 1080 was "named after HD TV," then it was one of the longest marketing buildups in history to get there, all to be undone by next year's GTX 1180.

      Wait... I'm being trolled right?