posted by on Wednesday January 11 2017, @08:08AM
from the surgically-grafted-to-the-inside-of-the-eyelids dept.

The top Google hits say that there is little or no benefit to resolution above 4k. I recently bought a 40" 4k tv which I use as a monitor (2' viewing distance). While that is right at the threshold where I'm told no benefit can be gained from additional resolution, I can still easily discern individual pixels, and I keep being able to see them until I get to about a 4' viewing distance (though I am nearsighted).

I did some research, and according to Wikipedia the fovea centralis (the center of the retina) has an angular resolution of 31.5 arc seconds. At that resolution, a 4k monitor would need to be only 16" at a 2' viewing distance, or my 40" would need a 5' viewing distance.
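
For anyone who wants to check the arithmetic, here's a rough back-of-the-envelope sketch (plain Python, assuming a 16:9 3840x2160 panel and measuring everything in inches; the 31.5 arc second figure is just the Wikipedia number above):

    import math

    ARCSEC_PER_RADIAN = 180 / math.pi * 3600

    def width_inches(diagonal_inches, aspect=(16, 9)):
        # Horizontal width of a 16:9 panel, computed from its diagonal.
        w, h = aspect
        return diagonal_inches * w / math.hypot(w, h)

    def pixel_arcsec(diagonal_inches, distance_inches, horizontal_pixels=3840):
        # Angle one pixel subtends at the given viewing distance, in arc seconds.
        pitch = width_inches(diagonal_inches) / horizontal_pixels
        return math.atan2(pitch, distance_inches) * ARCSEC_PER_RADIAN

    print(pixel_arcsec(40, 24))   # 40" 4k at 2': roughly 78 arc seconds per pixel
    print(pixel_arcsec(40, 48))   # 40" 4k at 4': roughly 39 arc seconds per pixel
    print(pixel_arcsec(40, 60))   # 40" 4k at 5': roughly 31 arc seconds per pixel
    print(pixel_arcsec(16, 24))   # 16" 4k at 2': roughly 31 arc seconds per pixel

The last two lines are where a pixel shrinks to about the 31.5 arc second figure, which is how I got the 16" and 5' numbers.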

Now, the fovea centralis covers only about the width of two thumbnails at arm's length (a 2° viewing angle), and the eye's resolution drops off quickly away from that center. But this tiny portion of the eye is processed by roughly 50% of the brain's visual cortex.
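
Under the same rough assumptions as the sketch above, that 2° patch covers surprisingly little of the screen at my 2' viewing distance:

    import math

    distance_inches = 24.0                                # 2' viewing distance
    fovea_degrees = 2.0                                   # foveal field of view
    patch = 2 * distance_inches * math.tan(math.radians(fovea_degrees / 2))
    pitch = (40 * 16 / math.hypot(16, 9)) / 3840          # pixel pitch of a 40" 4k panel
    print(round(patch, 2), "inches")                      # about 0.84 inches of screen
    print(round(patch / pitch), "pixels")                 # about 92 pixels across

If that arithmetic is right, the sharpest part of my vision only covers a strip about 90 pixels wide at any one moment.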

So I ask: are there any soylentils with perfect vision and/or a super-high-resolution setup, and does that match the distance at which you can no longer discern individual pixels? Do you think "retina" resolution needs to match the fovea centralis, or is a lesser value acceptable?

My 40" 4k at 2' fills my entire field of view. I really like it because I have so much screen real estate for multiple windows or large spreadsheets, or I can scoot back a little bit for gaming (so I don't have to turn my head to see everything) and enjoy the higher resolution. I find 4k on high graphics looks much nicer than 1080p on Ultra. I find the upgrade is well worth the $600 I spent for the tv and a graphics card that can run it. Have you upgraded to 4k and do you think it was worth it? I would one day like to have dual 32" 8k monitors (not 3D). What is your dream setup if technology and price weren't an issue?

Written from my work 1366 x 768 monitor.

Related discussions: First "8K" Video Appears on YouTube
LG to Demo an 8K Resolution TV at the Consumer Electronics Show
What is your Video / Monitor Setup?
Microsoft and Sony's Emerging 4K Pissing Contest


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 4, Interesting) by acid andy on Wednesday January 11 2017, @01:47PM

    by acid andy (1683) on Wednesday January 11 2017, @01:47PM (#452486) Homepage Journal

    I recently bought a 40" 4k tv which I use as a monitor (2' viewing distance).

    I'm occasionally tempted by a similar setup myself. There used to be a very clear physical difference between a computer monitor and a television - for a long while televisions remained analog, and even after the digital switchover monitors kept the edge for a while and typically used different connectors, up to DVI.

    CRT monitors supported a wide range of resolutions, whereas CRT televisions didn't. Now every LCD has only one native resolution (scaling doesn't count), and monitors are more often than not connected over HDMI just the same as TVs.

    So is there any practical reason now for choosing a device marketed as a "monitor" rather than a "television"? Are the only remaining differences price and marketing spin (and maybe the presence or absence of a DVB decoder)? I struggle to see any difference myself. Also, why can I buy a large OLED TV but not a large OLED monitor?

    --
    If a cat has kittens, does a rat have rittens, a bat bittens and a mat mittens?
  • (Score: 2) by VLM on Wednesday January 11 2017, @02:03PM

    by VLM (445) on Wednesday January 11 2017, @02:03PM (#452491)

    My MiL found it impossible to buy a TV that wasn't "smart", so it takes like 30 seconds to boot up, switch inputs, or do much of anything. Smart TV UIs always suck. Thankfully it doesn't nag her about her lack of internet access.

    I would imagine there are interesting differences in standards for stuck pixels. In the oldest days of LCD displays there certainly were.

    When I set up my TV in my living room, the fad at the time was to provide the user with an infinite number of video processing features that mainly degrade the experience. Eventually I figured out that "game" mode was pretty much pass-through, unmolested, but it took some work. Monitors don't have that anti-feature, do they?

    They seem to put a lot of effort (for a widely varying result) into the audio for TVs.

  • (Score: 1, Informative) by Anonymous Coward on Wednesday January 11 2017, @03:34PM

    by Anonymous Coward on Wednesday January 11 2017, @03:34PM (#452528)

    OP here, sorry I don't have an account.

    The TV I bought is "smart", but it has a gaming mode, which is what I use. I do have a 20" 1080p monitor on the side that holds my shortcuts and taskbar (for when I'm using full-screen applications); I don't use it often, but it's convenient to always have it uncovered. Anyway, they perform pretty much identically. So yes, IMHO you can use a tv as a monitor (or vice versa). I priced monitors when I bought, but they were more expensive at larger sizes (and still don't go as big as tvs).

    The 40" 4k is basically four 1080p monitors stacked in a square without a bezel.

  • (Score: 0) by Anonymous Coward on Wednesday January 11 2017, @04:26PM

    by Anonymous Coward on Wednesday January 11 2017, @04:26PM (#452546)

    When did HDMI connectors on PCs become normal? I had to struggle to find something to connect a Raspberry Pi I bought because I don't have any HDMI anything.

    • (Score: 2) by Scruffy Beard 2 on Wednesday January 11 2017, @09:49PM

      by Scruffy Beard 2 (6030) on Wednesday January 11 2017, @09:49PM (#452719)

      About 2006 with the release of Windows Vista and the Protected Media Path.

      Me Bitter?

  • (Score: 2, Informative) by Anonymous Coward on Thursday January 12 2017, @02:13AM

    by Anonymous Coward on Thursday January 12 2017, @02:13AM (#452815)

    Differences between TVs and monitors include how quickly the display responds to input depending on the port (e.g. is there a 100 ms response time that gives noticeable lag when tracking a mouse, but isn't noticeable when watching a continuous movie?), which ports are available (e.g. DisplayPort?), and the ergonomic adjustments on the stand. There are also "smart" TV negatives like typically long boot-up times. So it depends on what your priorities are, the specifics of the TV (some can turn off the filtering that causes lag), and your finances.

  • (Score: 2, Informative) by toddestan on Thursday January 12 2017, @04:45AM

    by toddestan (4982) on Thursday January 12 2017, @04:45AM (#452842)

    Monitors for the most part just display what's input to them. A lot of TVs muck around with and process the incoming signal, and while that may be OK-ish for video sources, it can result in a very noticeably worse picture for the things you use a computer for. Plus some TVs like to overscan or rescale the input, for reasons I don't really understand. It can be a bit of a crapshoot whether or not you can turn this stuff off. Some TVs will disable it, but only for certain inputs such as the "PC" input (i.e. analog VGA), not the HDMI inputs. I wouldn't buy a TV to use as a monitor without researching it very carefully.