
posted by chromas on Friday September 21 2018, @03:59AM
from the moar-pixels! dept.

[Update: WOW! Thanks for all the useful feedback! Plenty of information on the TV-as-a-monitor side of things (but feel free to add more!). I would very much appreciate it if folks could provide some input on what has worked for them in using a laptop to drive a 4K display. I'd consider a used system. Ideally, I'd like something in the ~$300 range, but am resigned to the fact I may have to kick out more like ~$750. What graphics adapter do you have? Is it an integrated model (e.g. Intel HD 630) or a discrete card? Which model? What troubles, if any, have you had getting proper drivers (Windows or Linux/Debian/BSD/etc.)? Could you get the full 60 Hz, or were you limited to 30 Hz? See below the fold for details on my current system and what my needs are compute-wise. --Bytram]

Summary: I need more screen space.

Which means I'll need a new (to me) laptop (portability++) which can support more pixels. I want a system that is Linux/BSD friendly. I don't have a whole lot of money to spend, so I'm hoping I can draw on the experience of my fellow Soylentils to help point me in the right direction. I'd like to avoid overspending, but I don't want to find that I've boxed myself into a corner by making an ignorant mistake.

I used to follow the bleeding edge of technology, but I've now firmly moved into the "I want it to just work" camp.

Current Display: I have a 24-inch, 1920x1200 computer monitor. The majority of my display is taken up by my Internet browser (Pale Moon), which generally has 50+ tabs open. It sits flush with the top of my screen and covers the entire display except for a ~2-inch margin on the sides and 3 inches on the bottom. It overlays my HexChat IRC (Internet Relay Chat) client, which runs across the bottom 1/3 of my screen. The remainder of the screen has corners of command windows poking out, as well as various utilities like an analog clock, performance monitor, connection monitor, etc.

TV as Monitor: Over the past few months I've seen the prices for 4K (3840x2160) televisions plummet. I've got my eye on a TCL 43S517 43-Inch 4K Ultra HD Roku Smart LED TV (2018 Model) which Amazon has on sale for $349.00 with free shipping.

As I see it, I could get a display with better dot pitch than what I have now, and much more screen real estate, for relatively little money.

The vast majority of what I do is command-line based, be it in a Windows (7 Pro x64) CMD.exe command window or an occasional PuTTY session into Soylent's servers. I do not do any video gaming. My only video needs are an occasional short clip from YouTube or a DVD (I have neither cable TV nor do I stream video with Netflix or their ilk; no Blu-ray, either). Internet access is currently via a tethered LTE cell phone.

Current computer: Thanks to the generosity of a fellow Soylentil, my current system is a Dell Latitude E6400 with a Core 2 Duo P8700 (1.8-2.5 GHz) with 8GB RAM and a 500GB 7200-rpm WD Black disk drive. Video is handled by an NVIDIA Quadro NVS 160M.

New Laptop: My current machine is not going to cut it. So, I'm also on the lookout for a new (to me) laptop. I don't need much in the way of compute power; I figure pretty much any i3 or i5 should be more than enough for my computing needs. An Intel integrated graphics chip should also be up to the task given a recent enough generation, but I'm not sure how current a model I'd need. I'm further confused by the different connection schemes and versions. I've found this page on Intel's site. What will I need? HDMI 1.4? DisplayPort 1.2? Other? Would I be able to run both a 4K monitor @ 60 Hz and my existing 1920x1200 display?
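To put rough numbers on the port question, here is a back-of-envelope sketch, not a buying guide: the link payload rates are approximate figures from the published specs, and the pixel-rate calculation ignores blanking overhead, so real requirements run somewhat higher.

```python
# Rough check of which display links can carry a given mode. The data
# rates are approximate usable payloads from the published specs
# (HDMI 1.4 ~8.16 Gbps, HDMI 2.0 ~14.4 Gbps, DP 1.2 HBR2 ~17.28 Gbps).
# Pixel rate here ignores blanking intervals, so real requirements run
# roughly 10-20% higher; treat the margins accordingly.
LINK_GBPS = {"HDMI 1.4": 8.16, "HDMI 2.0": 14.4, "DisplayPort 1.2": 17.28}

def needed_gbps(width, height, hz, bits_per_pixel=24):
    # 24 bits per pixel = 8-bit RGB with no subsampling.
    return width * height * hz * bits_per_pixel / 1e9

for w, h, hz in [(3840, 2160, 30), (3840, 2160, 60)]:
    need = needed_gbps(w, h, hz)
    capable = [port for port, cap in LINK_GBPS.items() if cap >= need]
    print(f"{w}x{h}@{hz}Hz: ~{need:.1f} Gbps -> {capable}")
```

The upshot, if those figures hold: 4K@30 fits any of the three, but 4K@60 with full RGB needs HDMI 2.0 or DisplayPort 1.2; HDMI 1.4 falls short.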

With the increasing trade war rhetoric, I'm getting nervous there may be a price spike in the not too distant future. Further, I sense merchants are clearing out the current stock in anticipation of the holiday season, so I'm thinking the time is right for me to take the plunge and upgrade.

Conclusion: So, what have your experiences been using a 4K television as a computer monitor? What 'gotchas' have you run into? What things did you learn the hard way that you wish someone had told you about beforehand? What driver problems have you encountered? Did you have any issues with Linux/BSD drivers? What worked for you?


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 1, Interesting) by Anonymous Coward on Friday September 21 2018, @12:22AM (17 children)

    by Anonymous Coward on Friday September 21 2018, @12:22AM (#737884)

    First, to be clear, I've done this with an LG 48" 4K screen. This screen actually supports 4096x2160. Its price was approximately what you're planning on spending.

    Pros:
    1. 3d gaming becomes super sharp. So sharp you can turn off the anti-aliasing and not notice the jaggies. This, in my opinion, is the only reasonable use case.
    2. Astounding amounts of text in your favorite editor at 1:1 with "normal" font sizing (12-16pt). (over 255x100 characters in a full screen console).
    3. Snapping applications to the corners makes them generally act like they're on their own 1080p desktop.
    4. Your desktop wallpaper will look better than it ever has. Especially earth-porn.

    Cons:
    1. If you want 1:1 pixel ratios for your desktop, with no scaling, expect your nice 12pt font to be crazy tiny on the screen. Hard on the eyes.
    2. Text near the borders of the screen (about 1cm worth) tends to wash out and is difficult to see unless you move your head to view the edges at closer to a 90-degree angle.
    3. Red pixels at 1:1 (at least on my screen) bleed horribly, to the point where text becomes illegible.
    4. Most large 4k screens have an intended viewing distance of about 2 meters; most 1080p PC monitors, about 1 meter. At 1 meter, you really cannot see the entire screen and keep it all visually in focus.
    5. You're really trying to display 4x 1080p screens at the same time. This affects your frame buffer size and will lower GPU performance a little. Just plugging the TV into my PC made the GPU's cooling fans spin.
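    Con #4 (viewing distance) can be put in rough numbers. A sketch, assuming a flat 43" 16:9 panel, a centered viewer, and the commonly cited ~60 pixels/degree figure for 20/20 acuity:

```python
import math

# Angular-size sketch for a 43" 16:9 4K panel. Assumptions: flat panel,
# viewer centered, small-angle approximation for per-pixel size.
DIAG_IN, W_PX = 43, 3840
width_m = DIAG_IN * 0.0254 * 16 / math.hypot(16, 9)  # ~0.95 m wide
pitch_m = width_m / W_PX                             # ~0.25 mm per pixel

def pixels_per_degree(distance_m):
    # How many pixels fit inside one degree of visual angle.
    return math.radians(1) * distance_m / pitch_m

def horizontal_fov_deg(distance_m):
    # Total horizontal angle the screen subtends at this distance.
    return math.degrees(2 * math.atan(width_m / 2 / distance_m))

for d in (1.0, 2.0):
    print(f"{d} m: {pixels_per_degree(d):.0f} px/deg, "
          f"{horizontal_fov_deg(d):.0f} deg wide")
```

    At 1 meter the panel spans about 51 degrees of visual field at ~70 px/deg: sharper than the eye needs, but too wide to take in at once, which matches the focus complaint above. At 2 meters it subtends a comfortable ~27 degrees.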

    Needless to say, I went back to using 1080p screens rather quickly.

    Now, as for the Roku embedded smartness. Meh. It's fine. Not great, but just mediocre. Get a Shield TV instead. Even though it's another $150-200, it's a significantly better streaming device, with excellent Android compatibility. Kodi works out of the box, and adding Android-based emulators is as easy as installing from the Play store. If you've got an NVIDIA-equipped PC, you can stream just about anything from the PC to the Shield, including the desktop, if that's your fancy.

    Never, ever plug that TV into your network. Just don't. It could easily spy on you.

  • (Score: 5, Interesting) by jmorris on Friday September 21 2018, @12:53AM (11 children)

    by jmorris (4844) on Friday September 21 2018, @12:53AM (#737905)

    You discovered the problem with trying to use a TV as a monitor. Unless you can get the very latest HDMI port on both the display and the PC, you can't really display 4K on it. They had to cheat; you have to give something up to stay inside the bandwidth limit. Usually what goes is full RGB, in favor of 4:2:0 chroma subsampling, or else the 60Hz refresh has to go. When you run the math, the bandwidth for 30Hz 4:2:0 at 2160p is exactly the same as 60Hz 4:4:4 at 1080p.
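    The arithmetic behind that trade-off can be sketched quickly (active pixels only, 8-bit samples, blanking ignored; reading the subsampled mode as 4:2:0, where a 2x2 block of pixels shares one pair of chroma samples):

```python
# Back-of-envelope data rates (active pixels only, 8 bits per sample,
# blanking intervals ignored). 4:4:4 carries 3 samples per pixel;
# 4:2:2 carries 2; 4:2:0 averages 1.5 because each 2x2 pixel block
# shares a single pair of chroma samples.
SAMPLES_PER_PIXEL = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def gbps(width, height, hz, subsampling):
    return width * height * hz * SAMPLES_PER_PIXEL[subsampling] * 8 / 1e9

print(gbps(1920, 1080, 60, "4:4:4"))  # ~2.99 Gbps
print(gbps(3840, 2160, 30, "4:2:0"))  # ~2.99 Gbps: identical payload
print(gbps(3840, 2160, 60, "4:4:4"))  # ~11.9 Gbps: needs a newer link
```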

    DisplayPort could apparently do the real thing a little earlier, but again, make sure the display AND the video source can do what you need. If you want to run a browser or a terminal emulator, you want 2160p with 4:4:4 RGB. And if you play games, you want 60fps, preferably without having to shift video modes. Otherwise text will look like crap unless you stick to strictly black-on-white or white-on-black. Single pixels can't carry their own color with 4:2:0 (chroma is shared across 2x2 pixel blocks), so you get the bleeding you describe.

    The next problem with TVs as monitors is overscan. If you buy the good name brand stuff you can usually switch it off or it does it automatically if you feed RGB but the house brands often remove those features as "incentives" to upgrade.

    DPMS is another monitor feature often missing from TV sets, which you want to make sure you get.

    • (Score: 2) by martyb on Friday September 21 2018, @02:47AM (8 children)

      by martyb (76) Subscriber Badge on Friday September 21 2018, @02:47AM (#737945) Journal

      I'd heard bits and pieces of that (HDMI / DisplayPort) over the years, but never put together as clearly as this; thanks!

      Overscan? With an LCD? Really??!!? I thought that was a holdover from the old CRT days, where they needed some "fudge factor" so the image would fit "properly" within the bezel. And that applies to LCDs? Huh! Thanks for that!

      As for DPMS (Display Power Management Signaling), I would never have thought about checking for that. Thanks so much for mentioning it!

      --
      Wit is intellect, dancing.
      • (Score: 2, Informative) by Anonymous Coward on Friday September 21 2018, @04:26AM (2 children)

        by Anonymous Coward on Friday September 21 2018, @04:26AM (#737968)

        Overscan? With an LCD? Really??!!?

        With a TV LCD, yes. All LCDs labeled for "television" actually perform overscan.

        Now, I can anticipate your next question: "why?"

        The answer is "legacy". Television broadcast signals were created presuming overscan in the receiver (because it was a CRT, and all but studio 'monitor' CRTs were set up to overscan), and so all "TV" signals are produced assuming that the receiver overscans.

        Also, because TVs overscanned, other enterprising folks decided they could use the overscan area for transmitting other data (it would, after all, be invisible to the viewer, because "overscan"). So if modern LCD TVs didn't overscan to compensate for a signal that assumes they do, you'd have your TV picture sitting in a box with all kinds of moving dots around it where the other 'stuff' lives (one big example is the text overlays for the hearing impaired, which are transmitted on several of the lines at the bottom of the picture, in the "overscan zone"). And most "Joe Sixpacks" wouldn't understand why their "movin pitcur" has all these 'ants' crawling around on the edges.

        The same reasoning is behind why most TVs now have "zoom" modes that cut off the edges of the picture so it "fills the screen": "Joe Sixpack" could not understand that a film-aspect picture was not the same aspect ratio as his newfangled "movin pitcur box", and that the black bars on the edges did not mean he wasn't getting "all the pitcur he paid fo".

        • (Score: 2) by martyb on Friday September 21 2018, @02:31PM

          by martyb (76) Subscriber Badge on Friday September 21 2018, @02:31PM (#738133) Journal

          TIL! Thanks so much for the clear explanation of the history and cause for the overscan. Makes perfect sense the way you explained it. It's replies like this which make all the time I put in posting stories and doing QA worthwhile. Thank you!

          --
          Wit is intellect, dancing.
        • (Score: 1) by ChrisMaple on Saturday September 22 2018, @01:07AM

          by ChrisMaple (6964) on Saturday September 22 2018, @01:07AM (#738436)

          I've seen signalling in the flyback (blanking) interval, but putting signalling in the overscan region seems quite peculiar. After all, overscan in CRT systems is typical, not guaranteed.

      • (Score: 2) by kazzie on Friday September 21 2018, @04:29AM (1 child)

        by kazzie (5309) Subscriber Badge on Friday September 21 2018, @04:29AM (#737969)

        My Raspberry Pi has overscan settings [stackexchange.com] in its config to compensate for this. I've had to use them on my HDMI LCD.
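        For anyone chasing the same fix, the relevant knobs live in the Pi's /boot/config.txt. The edge values below are illustrative, not universal; each display needs its own trim:

```ini
# /boot/config.txt - overscan compensation on the Raspberry Pi.
# Set disable_overscan=1 if the display shows the full frame as-is;
# otherwise trim each edge by however many pixels get cut off.
disable_overscan=0
overscan_left=16
overscan_right=16
overscan_top=16
overscan_bottom=16
```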

        • (Score: 2) by martyb on Friday September 21 2018, @02:34PM

          by martyb (76) Subscriber Badge on Friday September 21 2018, @02:34PM (#738137) Journal

          Good to know! Thanks for that... The more I'm learning about this, the more I'm learning how much I did not know. =)

          --
          Wit is intellect, dancing.
      • (Score: 2) by jmorris on Friday September 21 2018, @04:56AM (2 children)

        by jmorris (4844) on Friday September 21 2018, @04:56AM (#737974)

        Lack of DPMS is common. No idea why. Even signage displays often lack it. Had to use the serial port to control power on some LG screens.
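        As a sketch of what that serial control looks like: LG's published RS-232 protocol frames commands as plain ASCII, with "ka" being the power command. The set ID and port name below are assumptions for illustration, not values from any particular panel.

```python
# Sketch: power control over RS-232 for LG panels that lack DPMS,
# following LG's published serial protocol ("ka" = power). The set ID
# of 1 and the port name below are illustrative assumptions.
def lg_power_command(set_id: int, on: bool) -> bytes:
    # Frame format: "<command> <set id, hex> <data, hex>\r"
    return f"ka {set_id:02x} {1 if on else 0:02x}\r".encode("ascii")

print(lg_power_command(1, True))   # b'ka 01 01\r'

# Actually sending it needs a serial library (e.g. pyserial) and the
# right port, roughly:
#   import serial
#   with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as port:
#       port.write(lg_power_command(1, False))  # power off
```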

        • (Score: 2) by martyb on Friday September 21 2018, @02:57PM (1 child)

          by martyb (76) Subscriber Badge on Friday September 21 2018, @02:57PM (#738157) Journal

          Orly? Sense No Makes. Well, actually, anything to save a few pennies and make it look less expensive than the competition and thereby win a sale. Caveat emptor, once again!

          Thanks for the warning!

          --
          Wit is intellect, dancing.
          • (Score: 2) by jmorris on Friday September 21 2018, @03:43PM

            by jmorris (4844) on Friday September 21 2018, @03:43PM (#738181)

            But these were high-dollar units with serial ports and daisy-chainable VGA in/out ports for building a video wall. That's what makes no sense: why omit basic features like DPMS on anything that isn't explicitly sold for a desktop PC?

            Even more fun, my older LG TV set at home also has a serial port but since it is a consumer product it is gimped. You can send commands to it, you can even power it down over the serial port. If you knock the secret knock you can get a root console over it. But the serial port is connected to the main SoC which is powered down once you send a power off code. The power button and IR sensor are connected to a little 8051 microcontroller and it is responsible for bringing the power supply back on and it has no link to the serial port. Dumbness level of that is at least several hundred milli-Kohns. (I stole that gag from Ace at Ace of Spades HQ, he uses Sally Kohn at the NYT as a "reference dumb level" and rates stupid utterances, articles and other dumb things in milli-Kohns.)

    • (Score: 0) by Anonymous Coward on Friday September 21 2018, @03:16PM (1 child)

      by Anonymous Coward on Friday September 21 2018, @03:16PM (#738165)

      The next problem with TVs as monitors is overscan. If you buy the good name brand stuff you can usually switch it off or it does it automatically if you feed RGB but the house brands often remove those features as "incentives" to upgrade.

      I have a $120 TV and it has that feature. I think all modern ones must have such a feature.

      • (Score: 2) by jmorris on Friday September 21 2018, @03:50PM

        by jmorris (4844) on Friday September 21 2018, @03:50PM (#738188)

        I tried an Insignia (Worst Buy's house brand) and it lacked settings for overscan and discrete IR codes to select inputs, power on/off, etc. I read online that de-featuring house brands is fairly common, and took that set back the next day. I use a hacked remote with extensive macros and such to make the home theater and MythTV all play nice. If you know where to look, you can find the tools to take bog-standard remotes and make them useful; much nicer than those Logitech horrors that resemble the unholy spawn of a tablet and a remote. But key to any smart remote is discrete IR codes.

  • (Score: 2) by martyb on Friday September 21 2018, @01:10AM (4 children)

    by martyb (76) Subscriber Badge on Friday September 21 2018, @01:10AM (#737916) Journal

    Wow! Thanks for all of that!

    Yes, I had already concluded that anything larger than 43 inches would be problematic for me; thanks for confirming that. I'd not thought of the issues with different colors having different 'visibility' -- thanks!

    My current system with a 1920x1200, 24-inch monitor is set up to create cmd.exe command windows with an 8x12 font, arranged as 55 rows of 222 characters each. (I never seem to be able to get as many characters visible at one time as I'd like, and have been willing to put up with reduced legibility to achieve that.)
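    The jump in character real estate is easy to quantify. A quick sketch, assuming the same 8x12 cell and a 1:1 pixel mapping (no DPI scaling):

```python
# How many 8x12-pixel character cells fit on each display. The 4K
# figure assumes 1:1 pixel mapping with no DPI scaling, which may be
# uncomfortably small in practice.
def console_grid(width, height, cell_w=8, cell_h=12):
    return width // cell_w, height // cell_h

print(console_grid(1920, 1200))  # (240, 100)
print(console_grid(3840, 2160))  # (480, 180)
```

    So a 4K panel roughly triples the cell count of the current 1920x1200 screen, if the tiny glyphs are tolerable.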

    Maintaining focus across a large display... I'd thought of that, and it's why I am looking at a 40-43 inch display instead of a 48+ inch one. Though the larger size is only a small increment in price, I figured it would exacerbate the problem of keeping everything in focus at one time. Thanks for confirming that!

    Equivalence to 4 1080p screens... intellectually I was aware of that, but had not thought that it would be sufficient to drive up fan noise because of it... thanks!

    As for videos, I don't have cable TV. I don't have Netflix or any other streaming media. Just a rare DVD or occasional downloaded YouTube video. So, I appreciate the cautions but do not think it is applicable in my particular case. I'm sure there are others who are contemplating making this move, and for them this is a much more important factor, so thanks for bringing it up!

    Yes, never ever plug into my home network. Got it. Also, don't get it wet and don't feed it after midnight. ;)

    --
    Wit is intellect, dancing.
    • (Score: 1, Informative) by Anonymous Coward on Friday September 21 2018, @02:07AM (3 children)

      by Anonymous Coward on Friday September 21 2018, @02:07AM (#737934)

      Make sure your HDMI cable doesn't support ethernet-over-hdmi either.

      • (Score: 2) by martyb on Friday September 21 2018, @03:42AM

        by martyb (76) Subscriber Badge on Friday September 21 2018, @03:42AM (#737959) Journal

        Make sure your HDMI cable doesn't support ethernet-over-hdmi either.

        Gack! Would never have thought of that... thanks!!!

        --
        Wit is intellect, dancing.
      • (Score: 0) by Anonymous Coward on Friday September 21 2018, @06:36AM (1 child)

        by Anonymous Coward on Friday September 21 2018, @06:36AM (#737996)

        How am I supposed to make sure my HDMI cable doesn't support Ethernet (aka HEC, the HDMI Ethernet Channel) when it has been part of the standard since v1.4? Even the super-cheap AmazonBasics cables support it. So everyone who thought they were protected because they didn't let their smart TV connect to the WiFi, but still connected their computer/xbox/playstation/etc. with an HDMI cable made after ~2010, was unknowingly providing the TV with a wired network connection. DOH!

        If you want to avoid ethernet over HDMI, the only option is to verify the TV itself doesn't support HEC. It costs extra to build in support and some manufacturers opted to lower BOM costs rather than provide it. However that may have changed...
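        One practical way to check whether anything is actually riding a HEC link on a Linux box is to watch for a new network interface appearing when the cable is plugged in. A sketch (Linux-only sysfs path; interface names vary):

```python
# Sketch: detect an unexpected network interface (e.g. an active HEC
# link) on Linux by diffing /sys/class/net before and after plugging
# in the HDMI cable. Falls back to an empty set off-Linux.
import os

def network_interfaces() -> set:
    try:
        # Each entry in /sys/class/net is one network interface.
        return set(os.listdir("/sys/class/net"))
    except FileNotFoundError:
        return set()

before = network_interfaces()
# ... plug in the HDMI cable here ...
after = network_interfaces()
print("new interfaces:", sorted(after - before))
```

        As far as I know, common PC graphics cards don't implement HEC at all, so nothing should appear; the check is cheap reassurance rather than a guarantee.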

        • (Score: 2) by martyb on Friday September 21 2018, @03:02PM

          by martyb (76) Subscriber Badge on Friday September 21 2018, @03:02PM (#738158) Journal
          Excellent point. I would like to think that there would be a way on my computer to enable/disable that ethernet connection over the HDMI cable? Just guessing here, but it seems like an obvious thing to be able to do. Can anyone confirm one way or the other?
          --
          Wit is intellect, dancing.