Graphics card manufacturers like Nvidia and AMD have gone to great pains recently to point out that to experience Virtual Reality properly with a VR headset, you need a GPU capable of pushing a steady 90 FPS per eye, or a total of at least 180 FPS for both eyes, and at high resolutions to boot. This of course requires the purchase of the latest, greatest high-end GPUs made by these manufacturers, on top of the money you are already plonking down for your new VR headset and a good, fast gaming-class PC.
This raises an interesting question: virtually every LCD/LED TV manufactured in the last 5-6 years has a "Realtime Motion Compensation" feature built in. This is the not-so-new-at-all technique of taking, say, a football match broadcast live at 30 FPS and algorithmically generating extra in-between frames in realtime, giving you a hypersmooth 200-400 Hz image on the TV set with no visible stutter or strobing whatsoever. The technology is cheap enough to include in virtually every TV set at every price level (so the hardware that performs the realtime motion compensation cannot cost more than a few dollars in total), and the technique should, in theory, work just fine on the output of a GPU driving a VR headset.
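To be concrete about what I mean by "generating extra in-between frames", here is the crudest possible sketch of the idea in Python/numpy (the HxWx3 uint8 frame format is just an assumption for illustration). Real TV interpolators estimate motion vectors rather than simply averaging, which is why they do not ghost on fast pans, but even a plain blend shows how a new frame can be synthesized from two existing ones:

import numpy as np

def blend_midframe(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Synthesize a frame halfway between two HxWx3 uint8 frames by averaging.
    (Naive blend only; real motion compensation shifts content along motion vectors.)"""
    return ((frame_a.astype(np.uint16) + frame_b.astype(np.uint16)) // 2).astype(np.uint8)

def double_rate(frames):
    """Double the frame rate of a stream: A, mid(A,B), B, mid(B,C), C, ..."""
    prev = None
    for cur in frames:
        if prev is not None:
            yield blend_midframe(prev, cur)
        yield cur
        prev = cur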
Now suppose you have an entry-level or mid-range GPU capable of pushing only 40-60 FPS in a VR application (a measly 20-30 FPS per eye, making for a truly terrible VR experience). You could, in theory, add some cheap motion compensation circuitry to that GPU and get 100-200 FPS or more per eye. Heck, you might even be able to program a few GPU cores to run the motion compensation as a realtime GPU shader while the rest of the GPU renders the game or VR experience, along the lines of the sketch below.
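To sketch what such a shader would actually compute, here is a hedged CPU-side version of block-matching motion compensation in Python/numpy (greyscale float frames; the 8x8 block size and +/-4 pixel search window are my own illustrative choices, not anything a vendor ships). Each block is matched and placed independently, which is exactly the kind of embarrassingly parallel work a compute shader is good at:

import numpy as np

def mc_midframe(prev: np.ndarray, cur: np.ndarray, block: int = 8, search: int = 4) -> np.ndarray:
    """Synthesize the frame halfway between two greyscale HxW float frames
    using exhaustive block-matching motion estimation (sum of absolute differences)."""
    h, w = prev.shape
    out = 0.5 * (prev + cur)          # fallback for pixels no matched block lands on
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            ref = prev[by:by + block, bx:bx + block]
            best_cost, best_dy, best_dx = np.inf, 0, 0
            # Where did this block move to in the next frame?
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if 0 <= y <= h - block and 0 <= x <= w - block:
                        cost = np.abs(ref - cur[y:y + block, x:x + block]).sum()
                        if cost < best_cost:
                            best_cost, best_dy, best_dx = cost, dy, dx
            # Place the block halfway along its motion vector in the synthesized frame.
            moved = cur[by + best_dy:by + best_dy + block, bx + best_dx:bx + best_dx + block]
            my = min(max(by + best_dy // 2, 0), h - block)
            mx = min(max(bx + best_dx // 2, 0), w - block)
            out[my:my + block, mx:mx + block] = 0.5 * (ref + moved)
    return out

A real implementation would of course also have to deal with occlusions and keep the added latency down to a millisecond or two to be usable in a headset, but the per-block arithmetic itself is simple and cheap.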
So my question: Why don't GPUs for VR use Realtime Motion Compensation techniques to increase the FPS pushed into the VR headset? Would this not make far more financial sense for the average VR user than having to buy a monstrously powerful GPU to experience VR at all?
(Score: 1) by anubi on Friday July 15 2016, @04:53AM
I am a little late on this topic, but while we are discussing displays and interpolation, does anyone know if a modern VGA monitor/display will fall back and display the old-school monochrome MDA images if I connect the sync and video lines up properly? (Note the horizontal sync on those old babies was only 15,750 Hz or so, nowhere even close to a modern display.)
I still have and support some legacy monochrome systems, and would love to toss the old monochrome CRTs and use a modern display.
"Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
(Score: 1) by gOnZo on Friday July 15 2016, @11:59AM
I doubt it, though you might get lucky and find one that does. Bottom line: if there is a way to save a buck by dropping 'features' that very (VERY) few people use, especially a retrograde feature, manufacturers opt for saving the buck. There are market forces at work too; does it really make sense for Joe Consumer to purchase a brand-new color VGA monitor and then connect it to an old monochrome graphics card? Not much of a selling point.
I'm just amazed you can still get 5.25" floppy drives to work...
...or has calcification now re-classified these as 'hard' drives?
(Score: 1) by gOnZo on Friday July 15 2016, @12:27PM
The DVI limitation he refers to means that you have to ensure the monitor has either the standard HD15 (DE-15) VGA connector, OR a DVI-A (analog-capable DVI) connector plus a DVI-to-VGA adapter (purely mechanical, so cheap).
(Score: 1) by anubi on Saturday July 16 2016, @05:03AM
Thanks for the YouTube links. I had seen those offboard converter boxes; however, I figured that since someone already had a frame scaler running that could take every VGA mode out there and scale it to the particular LCD panel they used, and since new LCDs have far more resolution than the old MDA, they would just throw MDA support in as well. At least the edges of the sync pulses are consistent, and any positioning of the video content (phasing relative to sync) could be done by the customer, with that setup saved like any other setup.
Kinda like playing an old retro Atari game on a modern PC... but in my case it's old stuff like CAM machines, and several of my old DOS tools ran a debug screen on the mono display while the program itself wrote to VGA.
I am loath to part with my old DOS stuff, as I understand and trust my old stuff far more than I trust this new stuff that comes pre-loaded with malware I cannot remove. It may be like comparing a bike to a car, but if the government comes in and forces cars to be licensed while bikes are not, then someone else has control over my ride if I choose the car, and use of that car can be denied me if I fail to do something someone else wants me to do.
I already see strong economic forces at work with my government to shield themselves from lawsuits should they decide to use their computing systems to enforce their business model, while treating me as a criminal if I work around it. For most of the stuff I do, I do not need pretty pictures or CPU-intensive graphics... rather, most of it is quite simple robotics-type stuff.
"Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]