AMD emerged as a serious threat to Intel in servers more than a decade ago, but after a series of missteps and bad chips, the company's server business is hanging on by a thread.
Now, AMD is rebooting its server chip business with the upcoming Zen CPU, which will also be used in PCs. AMD is getting creative with Zen and considering merging the CPU with a high-performance GPU to create a mega-chip for high-performance tasks.
"It's fair to say we do believe we can combine a high-performance CPU with the high-performance GPU," AMD CEO Lisa Su said during an earnings call on Thursday.
Su's comment was in response to a question on whether AMD would ultimately combine its Zen CPU with a GPU based on the upcoming Vega architecture into one big chip for enterprise servers and supercomputing.
"Obviously, it'll come in time," Su said. "It's an area where combining the two technologies makes a lot of sense."
It wouldn't be the first time AMD has built a mega-chip. It has already combined full-featured CPUs and GPUs on made-to-order chips for the Xbox One and PlayStation 4 gaming consoles. The 5-billion transistor Xbox One chip uses an eight-core AMD CPU code-named Jaguar and a Radeon graphics processor.
GPUs are being used as co-processors in some of the world's fastest computers for tasks like weather modeling, economic forecasting, and weapons design. They are also used by Google in data centers for deep learning tasks. Nvidia has cornered the supercomputing space while AMD has struggled with its FirePro high-performance GPUs.
But AMD's integrated mega-chip would be unique. Nvidia has high-performance GPUs but lacks a server CPU. Intel's CPUs dominate servers, but the company does not offer a high-performance discrete GPU. Some supercomputers combine Nvidia GPUs with CPUs from Intel or AMD.
-- submitted from IRC
(Score: 4, Insightful) by seeprime on Sunday July 24 2016, @05:01PM
After ten years of disappointing APUs, and FX CPUs with excessive power draw (over 200 watts), it's time for AMD to just release a great product. Many of us just don't believe them anymore.
(Score: 0) by Anonymous Coward on Sunday July 24 2016, @05:52PM
That reminds me, how do they intend to compete on power efficiency? It's really important in the market, and they seem to be behind Intel and Nvidia on that front.
(Score: 2) by takyon on Sunday July 24 2016, @06:19PM
Looking at the rumor mill, the 16-core Zen Opteron ("Snowy Owl") could have a TDP of less than 100 watts. This should also be one of the parts with graphics.
There could also be 4, 8, 12, 24, and 32 core variants [fudzilla.com]. I'm not sure which ones will have graphics.
(Score: 2, Interesting) by Anonymous Coward on Sunday July 24 2016, @06:05PM
AMD uses signed management engine code, just like Intel's ME, and both are known to have issues and provide ring -2/-3 level access to the memory controller, Ethernet devices, etc., while not being auditable from the x86 side. I am not sure even a good performance chip would get me to buy AMD anymore.
They had a chance to differentiate themselves from Intel by being an open platform, but instead they chose to emulate every bad thing Intel has done for the past 8+ years, including reneging on their support of coreboot with limited or no blobs. Anything post-AM3 is suspect. (Some FM2 CPUs are apparently acceptable, but anything FAM15h+ is not.)
As a result AMD is getting no new business from me, same as Intel. A limited subset of ARM hardware is holding me over in the meantime, only due to having given up gaming thanks to Online DRM, Steam and 'Glorified PC' Consoles.
Hopefully the Talos workstation, or a RISC-V/J2+ 'reference' board, will come out within the next year to provide an alternative, fully open motherboard architecture to replace Wintel/AMD/ARM for those of us concerned with our freedom and our personal information security.
(Score: 0) by Anonymous Coward on Sunday July 24 2016, @06:07PM
used/use is/are and blogs/blobs
(Score: 0) by Anonymous Coward on Sunday July 24 2016, @10:42PM
See now if you had a cpu that looked after you, grammatical errors would be a bad memory only.
(Score: 0) by Anonymous Coward on Sunday July 24 2016, @06:35PM
It's also annoying that they disable the CPU video processing unit on their processors if you have a separate graphics card unless that graphics card is a Radeon. Only then can you use both. Otherwise the video processing unit on the CPU locks down and becomes unusable.
(Score: 0) by Anonymous Coward on Monday July 25 2016, @12:28AM
That bullshit is all thanks to sucking microsoft cock along with whoring themselves to the *AA johns. And Fuck intel and its garbage designs: ACPI, IME, EFI, etc.