posted by martyb on Saturday October 12 2019, @01:20AM   Printer-friendly
from the for-some-values-of-phenomenal dept.

We Played Modern Games on a CRT Monitor - and the Results are Phenomenal:

It's true. Running modern games on a vintage CRT monitor produces absolutely outstanding results - subjectively superior to anything from the LCD era, up to and including the latest OLED displays. Best suited for PC players, getting an optimal CRT set-up isn't easy, and prices vary dramatically, but the results can be simply phenomenal.

The advantages of CRT technology over modern flat panels are well-documented. CRTs do not operate from a fixed pixel grid in the way an LCD does - instead three 'guns' beam light directly onto the tube. So there's no upscaling blur and no need to run at any specific native resolution as such. On lower resolutions, you may notice 'scan lines' more readily, but the fact is that even lower resolution game outputs like 1024x768 or 1280x960 can look wonderful. Of course, higher-end CRTs can input and process higher resolutions, but the main takeaway here is that liberation from a set native resolution is a gamechanger - why spend so many GPU resources on the number of pixels drawn when you can concentrate on quality instead without having to worry about upscale blurring?
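
To put rough numbers on that last point, here is a quick sketch of how the per-frame pixel count scales across a few standard display modes (these are just the usual resolutions, not figures from the article):

```python
# Pixels the GPU has to shade per frame at a few standard modes, relative
# to a typical CRT-era resolution.
modes = {
    "1024x768": (1024, 768),
    "1280x960": (1280, 960),
    "1920x1080": (1920, 1080),
    "3840x2160 (4K)": (3840, 2160),
}

baseline = 1024 * 768
for name, (w, h) in modes.items():
    pixels = w * h
    print(f"{name:>15}: {pixels:>9,} px ({pixels / baseline:.1f}x the 1024x768 load)")
```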

Are there any Soylentils here who still use a CRT for gaming? If I could just find a CRT with a 65-inch diagonal, and a table that could support the weight...


Original Submission

 
  • (Score: 2, Interesting) by Anonymous Coward on Saturday October 12 2019, @02:06AM (6 children)

    by Anonymous Coward on Saturday October 12 2019, @02:06AM (#906162)

    I used CRTs as long as it was reasonably practical, but newer GPUs generally don't have VGA support any more (even over the DVI port) so you would have to build an adapter circuit (which wouldn't be that hard, but a lot harder than just connecting a plug). My 2012 Radeon 7950 had VGA, my 2017 RX580 does not.

    CRTs have one remaining quality advantage: they do not have any motion blur. It does not matter how fast the LCD transition time is; LCDs will always have motion blur as long as they use an always-on backlight. Motion blur is caused by seeing two images at once. CRTs only ever display pixels from one frame or another - never a blurred pixel that is half frame A, half frame B. But if LCDs shut off the backlight while the pixels transition, they would eliminate this problem (they'd need a much more powerful backlight, so it's not a simple problem, and yes, at low refresh rates this would cause eyestrain from flicker, just like CRTs). Because of the potential for flicker, this should only happen in games, so it will probably only be a feature on high-end gaming monitors.

    But LCDs do have plenty of advantages themselves. All modern GPUs have enough performance to reach at least 1080p resolution, so flexible resolutions are no longer meaningful. Yes, older games don't necessarily have the right resolution, but this is a problem that can be solved in software, even at the driver level, by upscaling. Gaming monitors no longer have significant display lag. In fact, today's gaming LCDs might have less lag in practice than CRTs did. This is because, while an LCD will always have about half a frame of display lag, the CPU is always a couple of frames ahead of the GPU's video output (four is typical, two is the practical minimum). Increasing the frame rate lowers the effective lag in the driver and GPU by reducing how much real time those few frames of lag represent (a rough back-of-the-envelope sketch of this follows at the end of this comment).

    Another advantage for LCDs is that they don't have to run at a fixed refresh rate. CRTs can change their refresh rate of course, but it takes a couple of seconds to do it. LCDs have Freesync (or the inferior proprietary licensed version G-sync) which allows them to update the display whenever data is ready. CRTs had only the choice of v-sync, which cost a frame of lag, and no v-sync, which caused frame tearing.

    As far as image quality goes, both have advantages. CRTs have better black depth and color uniformity. LCDs have perfect geometry and focus. I call this a wash, but there's room for personal preference here.

    I loved CRTs, still have a few for my retrocomputers, and used them on my main PC long after most people had switched. But there's really no technological reason to stay with them any more.

    The only thing I ask is that if you have a good quality CRT, don't recycle it. Give it to a retrocomputing enthusiast, especially if it's a monitor for a specific older system (Apple II, C64, Amiga, early Mac, etc.).
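
    As referenced above, here is a rough back-of-the-envelope sketch of the lag argument in this comment. The four-frame pipeline and the half-frame of display lag are the commenter's figures; the helper function and the frame rates chosen are purely illustrative.

    ```python
    # Back-of-the-envelope input lag: a fixed number of pipelined frames
    # (the CPU running ahead of the GPU's video output) plus roughly half a
    # frame of LCD display lag, all of which cost less real time as the
    # frame rate rises.
    def effective_lag_ms(fps, pipeline_frames=4, display_frames=0.5):
        frame_ms = 1000.0 / fps
        return (pipeline_frames + display_frames) * frame_ms

    for fps in (60, 120, 240):
        print(f"{fps:>3} fps: one frame = {1000.0 / fps:5.1f} ms, "
              f"total effective lag ~ {effective_lag_ms(fps):5.1f} ms")
    ```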

  • (Score: 2, Interesting) by Anonymous Coward on Saturday October 12 2019, @02:14AM

    by Anonymous Coward on Saturday October 12 2019, @02:14AM (#906166)

    > if you have a good quality CRT, don't recycle it. Give it to a retrocomputing enthusiast,

    About 10 years ago we replaced an old 25" Trinitron TV. I was about to put it out with the trash, but put it on Craigslist instead for $25. It went right away to a retro gamer--it seems that they've been around for some time.

  • (Score: 2) by takyon on Saturday October 12 2019, @02:19AM (1 child)

    by takyon (881) <takyonNO@SPAMsoylentnews.org> on Saturday October 12 2019, @02:19AM (#906168) Journal

    https://www.youtube.com/watch?v=V8BVTHxc4LM [youtube.com]

    I recall the video (I watched it earlier today) claiming there was no tearing with CRT technology, or maybe just with the Sony FW900, which they described as having a refresh rate of up to 165 Hz (which also might not be accurate [cnet.com]).

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
    • (Score: 1, Informative) by Anonymous Coward on Saturday October 12 2019, @03:24AM

      by Anonymous Coward on Saturday October 12 2019, @03:24AM (#906205)

      The FW900 was a phenomenal piece of equipment (and very expensive, when it was new and even now, relatively speaking). But 165Hz at full resolution wasn't possible.

      People who don't understand how CRTs work sometimes look at the maximum resolution and the maximum refresh rate and assume you can use both at once, but you can't. You are limited by the analog bandwidth of the monitor and the horizontal frequency of the electron beam. Analog bandwidth isn't a hard limit, but it does represent the maximum performance before image quality will degrade. You can turn a crisp image at 60Hz into a blurry one at 100Hz, even if the resolution is the same. Horizontal frequency (the maximum number of scanlines the beam can draw per second), on the other hand, is usually a hard limit; the monitor won't sync more than a percent or two above it.

      The FW900 manual doesn't specify a bandwidth, but it lists a mode with a 380MHz pixel clock, so let's round up to 400. That's very good. At 1920x1200 resolution that's about 120Hz. However, it would be limited by its H-frequency to about 100Hz. It does imply that you could squeeze a little more resolution out of it without losing much image quality. The monitor lists a mode of 2304x1440@80Hz (a 16:10 aspect ratio rather than today's typical 16:9) using the maximum 120kHz H-frequency, which is probably the best tradeoff of resolution, refresh rate and quality (a rough version of this arithmetic is sketched at the end of this comment).

      Today's LCDs can easily do 1080p at 120Hz, and 4K@120Hz or 1080p@240Hz are available.

      Of course this is a monitor from 2003. If CRTs were still being made for PC monitors, they'd probably have adopted today's high-speed data link connections, eliminating the analog bandwidth limit, but the horizontal frequency would still be a problem - that part is still analog circuitry.
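
      To make the two limits above concrete, here is a rough sketch of the arithmetic. The ~30% horizontal and ~5% vertical blanking overheads are crude GTF-style assumptions rather than the FW900's actual timings, and the 400MHz / 120kHz figures are the rounded numbers from this comment.

      ```python
      # The two CRT limits discussed above: pixel clock (analog bandwidth)
      # and horizontal scan frequency. Blanking overheads are rough guesses,
      # not the FW900's real timing parameters.
      def crt_refresh_limits(width, height, bandwidth_hz, h_freq_hz,
                             h_blank=1.30, v_blank=1.05):
          h_total = width * h_blank    # active pixels plus horizontal blanking
          v_total = height * v_blank   # active lines plus vertical blanking
          from_bandwidth = bandwidth_hz / (h_total * v_total)  # pixel-clock cap
          from_h_freq = h_freq_hz / v_total                    # scanline-rate cap
          return from_bandwidth, from_h_freq

      for w, h in ((1920, 1200), (2304, 1440)):
          bw, hf = crt_refresh_limits(w, h, bandwidth_hz=400e6, h_freq_hz=120e3)
          print(f"{w}x{h}: bandwidth allows ~{bw:.0f} Hz, "
                f"H-freq allows ~{hf:.0f} Hz -> usable ~{min(bw, hf):.0f} Hz")
      ```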

  • (Score: 0) by Anonymous Coward on Saturday October 12 2019, @04:30AM

    by Anonymous Coward on Saturday October 12 2019, @04:30AM (#906233)

    For 5-10 dollars extra you can get a female HDMI to male DVI-D adapter to plug onto the DVI-D port on a modern GPU (unless it is displayport) and connect it that way.

    I've used these to do dual/triple head VGA monitors off a GT630 card. Works great.

    Point is: The adapters are still out there. Some of them are even really cheap (Thanks Chinese entrepreneurs/clone makers!)

    If you really want to, you can have any mix and match of hardware you want, if you have the money and the patience to test for the right chain of adapters. I use such chains every day. You can see whether it beats your alternatives.

  • (Score: 0) by Anonymous Coward on Saturday October 12 2019, @04:53AM (1 child)

    by Anonymous Coward on Saturday October 12 2019, @04:53AM (#906236)

    Weird you think that VGA is not doable on modern cards without significant effort. DisplayPort to VGA adapters are readily available and cheap.

    • (Score: 0) by Anonymous Coward on Sunday October 13 2019, @10:47AM

      by Anonymous Coward on Sunday October 13 2019, @10:47AM (#906585)

      To address both of the above responses:
      HDMI doesn't have any provision for analog signals. You can't go from a passive HDMI-to-DVI adapter to a passive DVI-to-VGA adapter and have it work. The analog pins won't be connected to anything.

      DisplayPort to VGA adapters exist, but they're no good. They're intended for driving VGA-based projectors from laptops with no VGA output. None of them have the resolution or refresh rate for driving a CRT desktop monitor with any kind of acceptable quality. Most are locked to 60Hz.