Following the recent reorganization of AMD's (Advanced Micro Devices) GPU assets under the Radeon Technologies Group (RTG), AMD is talking about its 2016 GPU plans.
2016 Radeon GPUs will support DisplayPort 1.3 & HDMI 2.0a. DisplayPort 1.3 will allow for 5K and even 8K resolution over a single cable, as well as 4K at the higher refresh rates (75+ Hz) needed for AMD's FreeSync Low Framerate Compensation to work. FreeSync will also work over HDMI (which is cheaper and more commonly used than DisplayPort):
Implemented over a customized version of HDMI 1.4a and utilizing a prototype Realtek timing controller, AMD was able to demonstrate variable refresh rate technology running over HDMI. At the time of the presentation AMD was very clear that the purpose of the presentation was to shop around the concept and to influence the various members of the HDMI consortium, but they were also clear that bringing variable refresh rate tech to HDMI was something the company wanted to bring to retail sooner than later. Sooner, as it turns out, was the operative word there. As part of their presentation last week, RTG has announced that FreeSync over HDMI will be heading to retail, and that it will be doing so very soon: Q1'16. This is just a year after the first DisplayPort adaptive sync monitors hit retail, which for a display technology is a rather speedy turnaround from proof of concept to retail product.
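The summary's DisplayPort 1.3 resolution claims can be sanity-checked with back-of-envelope bandwidth arithmetic. This sketch ignores blanking intervals, so real-world headroom is somewhat smaller than shown:

```python
# Back-of-envelope check of the DisplayPort 1.3 resolution claims.
# DP 1.3 (HBR3): 4 lanes x 8.1 Gbit/s raw; 8b/10b encoding leaves
# 80% of that as usable payload.

DP13_PAYLOAD_GBPS = 4 * 8.1 * 0.8  # = 25.92 Gbit/s usable

def bandwidth_gbps(width, height, hz, bits_per_pixel=24):
    """Uncompressed video bandwidth in Gbit/s (24-bit RGB by default)."""
    return width * height * hz * bits_per_pixel / 1e9

for name, w, h, hz in [("5K @ 60 Hz", 5120, 2880, 60),
                       ("8K @ 30 Hz", 7680, 4320, 30),
                       ("8K @ 60 Hz", 7680, 4320, 60)]:
    need = bandwidth_gbps(w, h, hz)
    verdict = "fits" if need <= DP13_PAYLOAD_GBPS else "exceeds"
    print(f"{name}: {need:.1f} Gbit/s ({verdict} {DP13_PAYLOAD_GBPS:.2f} Gbit/s)")
```

5K at 60 Hz (~21.2 Gbit/s) and 8K at 30 Hz (~23.9 Gbit/s) fit within the payload; full-color 8K at 60 Hz (~47.8 Gbit/s) does not, which is why it requires chroma subsampling.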
The first FreeSync-capable laptop, the Lenovo Y700, was also announced by RTG; however, it supports only a narrow variable refresh range of 40 Hz to 60 Hz.
[More after the break.]
AMD is also promoting its support for high dynamic range (HDR). HDR monitors will have increased contrast ratios, able to display using increased luminance for bright spots while displaying darker blacks:
Getting there is going to take a lot of effort. Content needs to be mastered, distributed and displayed differently than what we're accustomed to. This time next year, we'll hopefully see LCD-based screens able to hit 2000 nits of luminance—a big improvement over the 471 cd/m2 attained by the brightest UHD monitor we've reviewed thus far. But even that's a far cry from the BT.2020 color space.
Still, Kim Meinerth, senior fellow and system architect, assured us that AMD put significant effort into building a display pipeline that supports the Society of Motion Picture and Television Engineers' 10-bit ST 2084 electro-optical transfer function, which is much better at mimicking human vision than today's BT.1886 curve.
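The ST 2084 "PQ" electro-optical transfer function mentioned above maps a normalized signal value to absolute luminance, peaking at 10,000 nits. A minimal sketch using the standard constants:

```python
# SMPTE ST 2084 "PQ" EOTF: maps a normalized signal value in [0, 1]
# to absolute luminance in cd/m2 (nits), with a 10,000-nit peak.
M1 = 2610 / 16384        # ~0.1593
M2 = 2523 / 4096 * 128   # ~78.84
C1 = 3424 / 4096         # ~0.8359
C2 = 2413 / 4096 * 32    # ~18.85
C3 = 2392 / 4096 * 32    # ~18.69

def pq_eotf(signal):
    """Luminance in nits for a normalized PQ signal in [0, 1]."""
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

print(pq_eotf(1.0))  # full code value -> 10000.0 nits
print(pq_eotf(0.0))  # -> 0.0
```

The steep perceptual quantizer curve is why 10 bits suffice to cover a 0-10,000 nit range without visible banding, where the older BT.1886 gamma curve would not.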
Many sites are reporting on the release of AMD's Crimson driver, which replaces Catalyst, features a redesigned interface and gives DirectX 9 applications access to variable framerates:
AMD’s first tease of Crimson was a run-through of the slick new Radeon Settings hub designed to replace Catalyst Control Center. (R.I.P.) At the time, AMD revealed some of the overt new features in Radeon Settings, such as per-game OverDrive overclocking settings and one-click Eyefinity multi-monitor configuration. On Tuesday, AMD’s unwrapping the deeper-level goodies in Radeon Software Crimson—with handy features for new and old graphics cards alike—and pushing the drivers live so you can try them out for yourself.
Crimson officially supports Windows 7 through 10. Linux users can expect a new, partially open-source driver (AMDGPU) sometime in the future, but only for the latest, shiniest graphics cards. Performance of the current Linux driver has also improved somewhat.
3D and 4K were nothing! It's all about HDR now!
Netflix has confirmed it has begun its rollout of high dynamic range content on its TV and film streaming service. HDR videos display millions more shades of colour and extra levels of brightness than normal ones, allowing images to look more realistic.
However, to view them members will need a new type of TV or monitor and a premium-priced Netflix subscription. Some HDR content had already been available via Amazon's rival Instant Video service. Ultra-high-definition 4K Blu-ray discs, which launched in the UK earlier this week, also include HDR data.
Netflix's support follows January's creation of a scheme defining the HDR standards a television set must meet to be marketed with an "Ultra HD Premium" sticker. [...] The US firm recommends its members have at least a 25 megabits per second connection to view them.
High-dynamic-range imaging at Wikipedia.
Related:
A Look at AMD's GPU Plans for 2016
LG to Demo an 8K Resolution TV at the Consumer Electronics Show
(Score: 2) by EQ on Thursday December 10 2015, @07:07AM
No news for a die shrink. Now that their 20nm failed, where's the 14nm? Not in 2016? Are we stuck at 28nm seemingly forever? What about reducing power usage and heat? All this about increased color is nice, but that's always been more of an issue with the panels and interfaces, not the processing.
(Score: 3, Informative) by takyon on Thursday December 10 2015, @08:57AM
It will almost certainly be 14/16nm in 2016.
But look at it this way: even stuck at 28nm for several years, AMD and NVIDIA managed to make vast improvements in their GPUs, while CPUs, especially Intel's, have delivered little beyond lower power consumption as they have gotten closer to 10nm.
http://www.extremetech.com/computing/199101-amd-nvidia-both-skipping-20nm-gpus-as-tsmc-plans-massive-16b-fab-investment-report-says [extremetech.com]
http://hexus.net/tech/news/graphics/83801-nvidia-tsmc-allegedly-tape-next-gen-14nm-pascal-gpu/ [hexus.net]
http://www.extremetech.com/extreme/203318-samsung-nvidia-will-supposedly-collaborate-on-14nm-gpus-but-on-what-process-node [extremetech.com]
http://wccftech.com/amd-greenland-gpus-feature-hbm2-14nm-coming-2016/ [wccftech.com]
http://www.techpowerup.com/212316/amd-readies-14-nm-finfet-gpus-in-2016.html [techpowerup.com]
FLOPS/W will double. Both companies will be using High Bandwidth Memory 2.0 on some of their GPUs, if not all of them. They will be great products.
[SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
(Score: 2) by RamiK on Thursday December 10 2015, @09:14PM
that's always been more of an issue with the panels and interfaces, not the processing.
Then why begin the post asking about the die shrink?
Regardless, it doesn't matter. All the new games target consoles first anyhow. Just pick up any half-decent AMD GPU and it should give a solid frame rate unless you're turning on special nVidia features like Hairworks (Witcher) and Godrays (Fallout)... you know, the stuff you wouldn't even notice at 4K unless someone told you to look out for them.
compiling...
(Score: 2) by Hyperturtle on Thursday December 10 2015, @11:09PM
Yeah, I never understood various people's insistence on MHz numbers and die sizes in nm or transistor counts or whatever.
Does it work, and does it work well for me as a consumer -- these are what I want to know.
I pretend to think that GHz is GHz and can be compared between CPUs, but that has not been true since 386s and 486s were on the market -- and even prior to that. A Cyrix is not an AMD is not an Intel, and the MHz listed may be faster or slower than the CPU placed next to it with the same number from a different brand, even if manufactured the same year. Some architectures are more efficient than others.
I agree none of that matters unless it requires a giant incandescent heat sink and fan combo that makes playing games with it (or worse, doing actual work on it) unpleasant for the user due to the heat and noise. Smaller transistors just seem to make things hotter, but it may be that I have never bought a low-performance device that generated no heat.
As an engineer, I always seem to buy a cutting edge that makes me bleed and I have to set up alternative cooling methods to get it to remain stable (otherwise I would just run the wizard like a regular person and complain that the devs need to optimize the code or something).
(Score: 2) by takyon on Friday December 11 2015, @12:38AM
Single card 4K is tough, and some PC games will be far more demanding than console titles even on lower resolutions.
It seems to me that the game developers are getting better at making games that will run on a wide variety of machines at lower levels of detail, but can benefit from the latest hardware when running at "ultra".
I see a lot of room for GPUs to improve. Detail levels and draw distances can always go up. 4K resolution will be desired by some, and 4K displays (as well as 2560×1440) are getting cheaper. 5K, 6K, and 8K resolutions are all possible, even if overkill. Dell, Apple (iMac), HP, and Philips all offer a 5K (5120×2880) display. DisplayPort 1.3 supports 8K at 30 Hz (or 60 Hz with 4:2:0 chroma subsampling). Multimonitor or widescreen adds to the GPU demand (someone out there has gamed at 11520×2160).
Virtual reality headsets will have pretty high resolutions, and demand very high and stable framerates. Oculus Rift in Q1 2016 will start out at 2160×1200 and 90 FPS. I wouldn't be surprised to see it escalate to something like 4096×2400 and 120 FPS (more than 5 times more pixels per second). Some VR headsets are going even wider, like StarVR [tomshardware.com], which uses two 2560×1440 panels for a 210-degree FOV (compared to about 100-110 degrees for Oculus and HTC Vive). If you're wondering why anybody would "need" any more than 180 degrees FOV in their headset, it's because you can look to the extreme left and right with your eyes.
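The "more than 5 times more pixels per second" figure can be checked directly; the 4096×2400 @ 120 FPS headset is the comment's own speculation, not an announced product:

```python
# Rough pixel-throughput comparison for the VR scenarios above.
def pixels_per_second(width, height, fps):
    """Raw pixels rendered per second at a given resolution and framerate."""
    return width * height * fps

rift_cv1 = pixels_per_second(2160, 1200, 90)      # Oculus Rift, Q1 2016
speculative = pixels_per_second(4096, 2400, 120)  # hypothetical future headset

print(f"Rift CV1:        {rift_cv1 / 1e6:.1f} Mpixels/s")
print(f"4096x2400 @ 120: {speculative / 1e6:.1f} Mpixels/s")
print(f"Ratio:           {speculative / rift_cv1:.2f}x")  # just over 5x
```

This counts delivered pixels only; rendering cost scales worse than linearly once supersampling for lens distortion is factored in.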
For all of the above scenarios, doing it on one card is preferable to 2, 3, or 4 GPUs, especially since 2 GPUs don't mean twice the performance... maybe 180% or something. Wait even longer, and the cheaper and lower-powered single GPUs will be able to handle 4K, etc.
(Score: 2) by RamiK on Friday December 11 2015, @01:11PM
There's always room to improve. Always room to optimize. But it doesn't matter since the developers are targeting the consoles primarily.
As for the 2/4K stuff and beyond, it won't be affordable anytime soon. What drove the price drops in high-res monitors were higher resolutions being a selling point in mobiles, which made the factories upgrade their gear. However, beyond 300-400 ppi (720-1080p at 5"), 20% of the population physically can't tell the difference (hyperopia) at arm's length, while the rest don't seem to care as long as the text is readable and the cameras continue to have lots of noise in low-light environments.
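The parent's 300-400 ppi ballpark follows from simple geometry (diagonal pixel count over diagonal size); a quick check for 5" panels:

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch from resolution and diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(f'720p  at 5": {ppi(1280, 720, 5.0):.0f} ppi')   # ~294 ppi
print(f'1080p at 5": {ppi(1920, 1080, 5.0):.0f} ppi')  # ~441 ppi
```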
As for VR being a driving force: assuming a doubling of performance every 6 months, it will take 2-3 years before GPUs and VR headsets that can deliver worthwhile quality at 60 Hz (the headache-free minimum for most) hit the market. That is, curved 2x4" 3840x1024 (over 1000 ppi) panels times 2 (one per eye) to cover the full 210°, and the processing power to match. And that's assuming companies like Sony are targeting VR as their next consoles and are putting up with zero-revenue R&D for a few years...
(Score: 2) by takyon on Friday December 11 2015, @02:25PM
That's where "ultra" detail presets and various user settings come in. They allow the PC master race to crank up detail far beyond what the consoles can support, while still supporting the consoles and gamers with weaker PC systems.
The two examples I'd use are Skyrim and GTA V. Skyrim could be made to look great and demand a lot of resources, but it also ran smoothly on relatively low-end systems with everything turned down. GTA V PC players got a version that was delayed but supported significantly better graphics than console versions (definitely compared to last-gen, but I'm not so sure about PS4/XBO).
There doesn't have to be "consolization" of PC games. A compromise can be reached and a wide variety of systems can be targeted.
Not precisely. I think the bigger panels are more expensive and harder to produce than the smaller panels, which has made it comparatively easy to make smartphones with insanely high PPI (such as above 800 PPI [soylentnews.org]). It also helps that smartphones are being sold at premium prices with more demand (although it is now slowing), while laptops and desktops are suffering extreme commoditization and lagging demand.
I am optimistic that differences can be perceived up to much higher resolutions. I don't have much evidence for that (I keep losing track of the relevant studies), but I note that AMD, for one, is looking at craziness like 16K resolution per eye for VR [soylentnews.org]. That comes from marketing materials and should be taken with some salt, but I'm buying it for now (while not spending any money on it).
That may be, but the companies seem to be going for it anyway and I would look to the opinions of people who have strapped VR to their face before dismissing the low/early adopter quality. That means 60 FPS on smartphone-based VR using Google Cardboard, and 90 FPS with Oculus Rift consumer version (Q1 2016). And the Oculus Rift recommended specs are:
NVIDIA GTX 970 / AMD 290 equivalent or greater
Intel i5-4590 equivalent or greater
8GB+ RAM
Not exactly breaking the bank. Could make a gaming PC for under $800 with all that they ask for.
The obvious way to push resolutions and refresh rates up is to sacrifice detail quality. Fewer tris and so on.
There's also another way you can reap the benefits of high resolution and framerates without needing 2020's GPUs: video. Prerendered video is a lot easier on dedicated hardware than gaming with real time rendering. Which means that artistic VR and entertainment will be able to achieve 2160p120 before gaming at those resolutions and framerates is practical.
(Score: 0) by Anonymous Coward on Thursday December 10 2015, @01:09PM
unless there is driver support that lasts, what else they do doesn't matter
(Score: 3, Informative) by takyon on Thursday December 10 2015, @01:44PM
http://www.anandtech.com/show/9811/amd-crimson-driver-overview [anandtech.com]
http://www.anandtech.com/show/9815/amd-moves-pre-gcn-gpus-to-legacy [anandtech.com]
(Score: 2) by Hyperturtle on Thursday December 10 2015, @09:37PM
As of this writing, the acronym "GCN" is not listed in the original posting of the article, nor in conversation. It is only in your quote and link.
What is GCN?
(It's the "Graphics Core Next" generation of hardware... starting with the previous generation that included the R9-2xx series).
I have two cards in R9 2xx generation and I had no idea what GCN was and had to look it up -- it never once came into my sphere of awareness when shopping for cards. It's a marketing term that I guess I just had as a blind spot; every game system since the Sega Genesis was marketed as "next gen" so if I were to take this at face value, they only recently dropped support for z-80 processors...
Anyway, if people intend to introduce an acronym not already part of the discussion -- can we at least be informed as to what it is if it doesn't directly relate to the topic at hand? Perhaps I am guilty of ignorance, but I still don't know what the OP's card was or even if GCN is related to it, as informative as it was. Nearly all of my cards lost support a while back; previous to the R9s I have, I had a 4850x2-- and that is still in use in an LGA771 conversion I did, and it still plays modern games pretty well despite not having a new driver in years.
(since I just said LGA771 without a description... that's the socket type for Intel core 2 servers, like Xeons, that can be used on an LGA775 motherboard with some physical modifications.)
(Score: 2) by takyon on Thursday December 10 2015, @11:34PM
I won't blame you for not knowing what GCN is, but I was merely adding the relevant information to the discussion, and the links were included so you can follow up if you need to know more.
I think the driver issue is a little precious. Once the (old) card works, it won't need too many driver updates.
(Score: 0) by Anonymous Coward on Thursday December 10 2015, @02:08PM
Ten years ago I bought a laptop with an ATI Radeon Mobile and watched AMD stop making drivers for it inside of a year of owning it. I haven't bought an ATI product since. The drivers just plain suck even when they exist.
(Score: 2) by takyon on Thursday December 10 2015, @11:35PM
Cool anecdata. I mean, cool 10-year-old anecdata.
(Score: 0) by Anonymous Coward on Friday December 11 2015, @08:40AM
I bought a used ATI AGP card about 1 month before xorg dropped support for it.
I was like: "..bbbut ATI is supposed to have better support than (S3) Unichrome!"
(Score: 2) by takyon on Friday December 11 2015, @11:11AM
ATI was acquired by AMD in 2006. Could that have led to your driver turmoil?
How long did the card work after support was dropped?
(Score: 0) by Anonymous Coward on Thursday December 10 2015, @06:37PM
Search for radeon in https://git.kernel.org/cgit/linux/kernel/git/stable/linux-stable.git/tree/firmware/WHENCE?id=HEAD [kernel.org]
ewww
And you thought Linux was pure...