
SoylentNews is people

posted by cmn32480 on Thursday December 10 2015, @04:46AM   Printer-friendly
from the how-much-better-can-it-get dept.

Following the recent reorganization of Advanced Micro Devices' (AMD) GPU assets under the Radeon Technologies Group (RTG), AMD is talking about its 2016 GPU plans.

2016 Radeon GPUs will support DisplayPort 1.3 & HDMI 2.0a. DisplayPort 1.3 will allow for 5K and even 8K resolution over a single cable, as well as 4K at the higher refresh rates (75+ Hz) needed for AMD's FreeSync Low Framerate Compensation to work. FreeSync will also work over HDMI (which is cheaper and more commonly used than DisplayPort):

Implemented over a customized version of HDMI 1.4a and utilizing a prototype Realtek timing controller, AMD was able to demonstrate variable refresh rate technology running over HDMI. At the time of the presentation AMD was very clear that the purpose of the presentation was to shop around the concept and to influence the various members of the HDMI consortium, but they were also clear that bringing variable refresh rate tech to HDMI was something the company wanted to bring to retail sooner than later. Sooner, as it turns out, was the operative word there. As part of their presentation last week, RTG has announced that FreeSync over HDMI will be heading to retail, and that it will be doing so very soon: Q1'16. This is just a year after the first DisplayPort adaptive sync monitors hit retail, which for a display technology is a rather speedy turnaround from proof of concept to retail product.
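As a rough sanity check on the bandwidth behind those resolution claims (my own back-of-the-envelope numbers, not from the article): DisplayPort 1.3's HBR3 link carries 32.4 Gbit/s raw, leaving about 25.92 Gbit/s for video after 8b/10b coding. A sketch, ignoring blanking overhead:

```python
# Back-of-the-envelope payload rates for DisplayPort 1.3 (HBR3):
# 4 lanes x 8.1 Gbit/s raw; 8b/10b coding leaves ~25.92 Gbit/s for video.
# Blanking overhead is ignored, so these are lower bounds on the real need.

DP13_PAYLOAD_GBPS = 25.92

def gbits_per_sec(width, height, hz, bits_per_pixel=24):
    """Uncompressed video data rate in Gbit/s."""
    return width * height * hz * bits_per_pixel / 1e9

for name, w, h, hz in [("4K @ 120 Hz", 3840, 2160, 120),
                       ("5K @ 60 Hz",  5120, 2880, 60),
                       ("8K @ 60 Hz",  7680, 4320, 60)]:
    rate = gbits_per_sec(w, h, hz)
    verdict = "fits" if rate <= DP13_PAYLOAD_GBPS else "needs subsampling or compression"
    print(f"{name}: {rate:.1f} Gbit/s -> {verdict}")
```

By this estimate, 5K at 60 Hz and even 4K at 120 Hz fit uncompressed at 24-bit color, while 8K at 60 Hz only works with tricks like 4:2:0 chroma subsampling.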

The first FreeSync-capable laptop, the Lenovo Y700, was also announced by the RTG; however, it only supports a narrow variable refresh range of 40 Hz to 60 Hz.


AMD is also promoting its support for high dynamic range (HDR). HDR monitors will have increased contrast ratios, able to display using increased luminance for bright spots while displaying darker blacks:

Getting there is going to take a lot of effort. Content needs to be mastered, distributed and displayed differently than what we're accustomed to. This time next year, we'll hopefully see LCD-based screens able to hit 2000 nits of luminance—a big improvement over the 471 cd/m2 attained by the brightest UHD monitor we've reviewed thus far. But even that's a far cry from the BT.2020 color space.

Still, Kim Meinerth, senior fellow and system architect, assured us that AMD put significant effort into building a display pipeline that supports the Society of Motion Picture and Television Engineers' 10-bit ST 2084 electro-optical transfer function, which is much better at mimicking human vision than today's BT.1886 curve.


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 2) by takyon on Friday December 11 2015, @12:38AM

    by takyon (881) <takyonNO@SPAMsoylentnews.org> on Friday December 11 2015, @12:38AM (#274726) Journal

    Single-card 4K is tough, and some PC games will be far more demanding than console titles even at lower resolutions.

    It seems to me that the game developers are getting better at making games that will run on a wide variety of machines at lower levels of detail, but can benefit from the latest hardware when running at "ultra".

    I see a lot of room for GPUs to improve. Detail levels and draw distances can always go up. 4K resolution will be desired by some, and 4K displays (as well as 2560×1440) are getting cheaper. 5K, 6K, and 8K resolutions are all possible, even if overkill. Dell, Apple (iMac), HP, and Philips all offer a 5K (5120×2880) display. DisplayPort 1.3 supports 8K at 60 Hz. Multi-monitor and ultrawide setups add to the GPU demand (someone out there has gamed at 11520×2160).

    Virtual reality headsets will have pretty high resolutions and demand very high, stable framerates. The Oculus Rift, due in Q1 2016, will start out at 2160×1200 and 90 FPS. I wouldn't be surprised to see that escalate to something like 4096×2400 and 120 FPS (more than 5 times as many pixels per second). Some VR headsets are going even wider, like StarVR [tomshardware.com], which uses two 2560×1440 panels for a 210-degree FOV (compared to about 100-110 degrees for the Oculus and HTC Vive). If you're wondering why anybody would "need" more than 180 degrees of FOV in a headset, it's because you can look to the extreme left and right with your eyes.
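The "more than 5 times" figure above is just raw pixel throughput; a quick check (the 4096×2400 @ 120 FPS headset is the commenter's hypothetical, not a real product):

```python
# Raw pixel throughput for the headsets discussed above.
# The 4096x2400 @ 120 Hz figure is a hypothetical future headset.

def pixels_per_sec(width, height, hz):
    return width * height * hz

rift_cv1 = pixels_per_sec(2160, 1200, 90)   # Oculus Rift consumer version (Q1 2016)
future   = pixels_per_sec(4096, 2400, 120)  # hypothetical future headset

print(f"{future / rift_cv1:.2f}x")  # 5.06x
```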

    For all of the above scenarios, doing it on one card is preferable to 2, 3, or 4 GPUs, especially since 2 GPUs don't mean twice the performance... maybe 180% or so. Wait even longer, and cheaper, lower-powered single GPUs will be able to handle 4K, etc.

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 2) by RamiK on Friday December 11 2015, @01:11PM

    by RamiK (1813) on Friday December 11 2015, @01:11PM (#274950)

    There's always room to improve. Always room to optimize. But it doesn't matter since the developers are targeting the consoles primarily.
    As for the 2K/4K stuff and beyond, it won't be affordable anytime soon. What drove the price drops in high-res monitors was high resolution becoming a selling point in mobiles, which pushed the factories to upgrade their gear. However, beyond 300-400 ppi (720-1080p at 5"), 20% of the population physically can't tell the difference (hyperopia) at arm's length, while the rest don't seem to care as long as the text is readable and phone cameras continue to produce lots of noise in low-light environments.
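Those ppi figures can be checked with basic geometry (diagonal pixel count over diagonal inches); note that 1080p on a 5-inch panel actually lands a bit above 400 ppi:

```python
import math

# Pixel density for the phone resolutions quoted above, on a 5-inch panel.
def ppi(width, height, diagonal_inches):
    return math.hypot(width, height) / diagonal_inches

print(round(ppi(1280, 720, 5)))   # 294
print(round(ppi(1920, 1080, 5)))  # 441
```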
    As for VR being a driving force: assuming a doubling of performance every 6 months, it will take 2-3 years before GPUs and VR headsets that can deliver worthwhile quality at 60 Hz (the headache-free minimum for most) hit the market. That is, curved 2×4" 3840×1024 panels (over 1000 ppi), times 2 (one per eye) to cover the full 210°, and the processing power to match. And that's assuming companies like Sony are targeting VR as their next consoles and are putting up with zero-revenue R&D for a few years...
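For what it's worth, "doubling every 6 months" compounds very quickly; the assumption is far more aggressive than GPUs have historically managed:

```python
# Compound growth implied by "doubling every 6 months" (an aggressive
# assumption; GPU performance has historically doubled much more slowly).
def speedup(years, doubling_period_years=0.5):
    return 2 ** (years / doubling_period_years)

print(speedup(2))  # 16.0
print(speedup(3))  # 64.0
```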

    --
    compiling...
    • (Score: 2) by takyon on Friday December 11 2015, @02:25PM

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Friday December 11 2015, @02:25PM (#274970) Journal

      There's always room to improve. Always room to optimize. But it doesn't matter since the developers are targeting the consoles primarily.

      That's where "ultra" detail presets and various user settings come in. They allow the PC master race to crank detail up far beyond what the consoles can handle, while still supporting the consoles and gamers with weaker PCs.

      The two examples I'd use are Skyrim and GTA V. Skyrim could be made to look great and demand a lot of resources, but it also ran smoothly on relatively low-end systems with everything turned down. GTA V PC players got a version that was delayed but supported significantly better graphics than console versions (definitely compared to last-gen, but I'm not so sure about PS4/XBO).

      There doesn't have to be "consolization" of PC games. A compromise can be reached and a wide variety of systems can be targeted.

      What drove the price drops in high-res monitors was high resolution becoming a selling point in mobiles, which pushed the factories to upgrade their gear.

      Not precisely. I think the bigger panels are more expensive and harder to produce than the smaller panels, which has made it comparatively easy to make smartphones with insanely high PPI (such as above 800 PPI [soylentnews.org]). It also helps that smartphones are being sold at premium prices with more demand (although it is now slowing), while laptops and desktops are suffering extreme commoditization and lagging demand.

      However, beyond 300-400 ppi (720-1080p at 5"), 20% of the population physically can't tell the difference (hyperopia) at arm's length

      I am optimistic that differences can be perceived up to much higher resolutions. I don't have much evidence for that (I keep losing track of the relevant studies), but I note that AMD, for one, is looking at craziness like 16K resolution per eye for VR [soylentnews.org]. That comes from marketing materials and should be taken with a grain of salt, but I'm buying it for now (while not spending any money on it).

      As for VR being a driving force: assuming a doubling of performance every 6 months, it will take 2-3 years before GPUs and VR headsets that can deliver worthwhile quality at 60 Hz (the headache-free minimum for most) hit the market.

      That may be, but the companies seem to be going for it anyway, and I would look to the opinions of people who have strapped VR to their faces before dismissing early-adopter quality. That means 60 FPS on smartphone-based VR using Google Cardboard, and 90 FPS with the Oculus Rift consumer version (Q1 2016). And the Oculus Rift recommended specs are:

      NVIDIA GTX 970 / AMD 290 equivalent or greater
      Intel i5-4590 equivalent or greater
      8GB+ RAM

      Not exactly breaking the bank; you could build a gaming PC for under $800 with everything they ask for.

      The obvious way to push resolutions and refresh rates up is to sacrifice detail quality. Fewer tris, and so on.

      There's also another way to reap the benefits of high resolutions and framerates without needing 2020's GPUs: video. Prerendered video is a lot easier on dedicated hardware than gaming with real-time rendering, which means artistic VR and entertainment will be able to achieve 2160p120 before gaming at those resolutions and framerates is practical.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]