AMD's Radeon Technologies Group Boss Raja Koduri Leaves, Confirmed to be Defecting to Intel

posted by Fnord666 on Thursday November 09 2017, @12:11AM   Printer-friendly
from the getting-out-of-town dept.

The boss of AMD's Radeon Technologies Group is leaving the company:

Remember when we reported on the Radeon Technologies Group boss, Raja Koduri, taking a leave of absence with an intent to return to the fold in December? That isn't going to happen, according to a memo Raja has written to his team, because today is his last day in the job.

[...] Our sources tell us that Lisa Su, AMD CEO, will continue to oversee RTG for the foreseeable future. AMD appreciates that such an important role cannot be the sole domain of the CEO, and to this end is actively searching for a successor to Raja. We expect the appointment to be made within a few months.

The rumor mill suggests that Koduri will take a job at Intel, which would come at an interesting time now that Intel is including AMD graphics and High Bandwidth Memory in some of its products.

Update: Intel to Develop Discrete GPUs, Hires Raja Koduri as Chief Architect & Senior VP

Also at HotHardware and Fudzilla.

Previously: Interview With Raja Koduri, Head of the Radeon Technologies Group at AMD


Original Submission

Related Stories

Interview With Raja Koduri, Head of the Radeon Technologies Group at AMD 27 comments

In a VentureBeat interview with Raja Koduri, head of the Radeon Technologies Group at AMD, the company continues to advocate for virtual reality running at "16K resolution" at up to 240 Hz:

When Advanced Micro Devices created its own stand-alone graphics division, Radeon Technologies Group, and crafted a new brand, Polaris, for its upcoming graphics architecture, it was an admission of sorts. AMD championed the combination of processors and graphics into a single chip, dubbed the accelerated processing unit (APU). But the pendulum swung a little too far in that direction, away from stand-alone graphics. And now it's Raja Koduri's job to compensate for that.

I interviewed Koduri at the 2016 International CES, the big tech trade show in Las Vegas last week. He acknowledged that AMD intends to put graphics back in the center. And he said that 2016 will be a very big year for the company as it introduces its advanced FinFET manufacturing technology, which will result in much better performance per watt — or graphics that won't melt your computer. Koduri believes this technology will help AMD beat rivals such as Nvidia. AMD's new graphics chips will hit during the middle of 2016, Koduri said.

Beyond 2016, Koduri believes that graphics is going to get more and more amazing. Virtual reality is debuting, but we won't be completely satisfied with the imagery until we get 3D graphics that can support 16K screens, or at least 16 times more pixels on a screen that[sic] we have available on most TVs today. Koduri wants to pump those pixels at you at a rate of 240 hertz, or changing the pixels at a rate of 240 times per second. Only then will you really experience true immersion that you won't be able to tell apart from the real world. He calls it "mirror-like" graphics. That's pretty far out thinking.
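
Taking those figures at face value gives a sense of the scale involved. Below is a quick back-of-the-envelope sketch in Python; it assumes "16K" means 15360×8640 (16 times the pixels of 4K UHD) at 24-bit color, figures the interview itself does not specify:

    # Rough scale of the "16K at 240 Hz" target. Resolution and color
    # depth are assumptions: 16K taken as 15360x8640, 24-bit RGB.
    width, height = 15360, 8640
    pixels = width * height                   # ~132.7 million pixels per frame
    bytes_per_pixel = 3                       # 24-bit color, no alpha
    refresh_hz = 240

    frame_bytes = pixels * bytes_per_pixel    # ~398 MB per frame
    scanout_bytes = frame_bytes * refresh_hz  # raw scan-out, ignoring rendering cost

    print(f"{pixels / 1e6:.1f} MP per frame")              # 132.7 MP
    print(f"{scanout_bytes / 1e9:.0f} GB/s raw scan-out")  # ~96 GB/s

For comparison, 1080p at 60 Hz needs roughly 0.37 GB/s of scan-out, so the target amounts to about a 256-fold jump in pixel throughput before any rendering work is counted.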

AMD's "Polaris" GPUs will be released sometime during the summer of 2016. Along with AMD's "Zen" CPUs and APUs, Polaris GPUs will be built using a 14nm FinFET process, skipping the 20nm node.


Original Submission

Intel Announces Core H Laptop Chips With AMD Graphics and High Bandwidth Memory 21 comments

Intel squeezed an AMD graphics chip, RAM and CPU into one module

the new processor integrates a "semi-custom" AMD graphics chip and second-generation High Bandwidth Memory (HBM2), which fills the role that GDDR5 plays in a traditional laptop.

Intel CPU and AMD GPU, together at last

Summary of Intel's news:

The new product, which will be part of our 8th Gen Intel Core family, brings together our high-performing Intel Core H-series processor, second generation High Bandwidth Memory (HBM2) and a custom-to-Intel third-party discrete graphics chip from AMD's Radeon Technologies Group* – all in a single processor package.

[...] At the heart of this new design is EMIB (Embedded Multi-Die Interconnect Bridge), a small intelligent bridge that allows heterogeneous silicon to quickly pass information in extremely close proximity. EMIB eliminates height impact as well as manufacturing and design complexities, enabling faster, more powerful and more efficient products in smaller sizes. This is the first consumer product that takes advantage of EMIB.

[...] Additionally, this solution is the first mobile PC to use HBM2, which consumes much less power and takes up less space compared to traditional discrete graphics-based designs using dedicated graphics memory, like GDDR5 memory.

takyon: This is more like an "integrated discrete GPU" than standard integrated graphics. It also avoids the need for Intel to license AMD's IP. AMD also needs to make a lot of parts since its wafer supply agreement with GlobalFoundries penalizes AMD if they buy less than a target number of wafers each year.

Also at AnandTech and Ars Technica.

Previously: AMD Stock Surges on Report of Intel Graphics Licensing Deal, 16-Core Ryzen Confirmed

Related: Samsung Increases Production of 8 GB High Bandwidth Memory 2.0 Stacks


Original Submission #1
Original Submission #2

Intel Planning a Return to the Discrete GPU Market, Nvidia CEO Responds 15 comments

Intel isn't just poaching a prominent AMD employee. Intel is planning a return to the discrete GPU market:

On Monday, Intel announced that it had penned a deal with AMD to have the latter provide a discrete GPU to be integrated onto a future Intel SoC. On Tuesday, AMD announced that their chief GPU architect, Raja Koduri, was leaving the company. Now today the saga continues, as Intel is announcing that they have hired Raja Koduri to serve as their own GPU chief architect. And Raja's task will not be a small one; with his hire, Intel will be developing their own high-end discrete GPUs.

[...] Perhaps the only news that can outshine the fact that Raja Koduri is joining Intel is what he will be doing for Intel. As part of today's revelation, Intel has announced that they are instituting a new top-to-bottom GPU strategy. At the bottom, the company wants to extend their existing iGPU market into new classes of edge devices, and while Intel doesn't go into much more detail than this, the fact that they use the term "edge" strongly implies that we're talking about IoT-class devices, where edge goes hand-in-hand with neural network inference. This is a field Intel already plays in to some extent with their Atom processors on the GPU side, and their Movidius neural compute engines on the dedicated silicon side.

However, in what's likely the most exciting part of this news for PC enthusiasts and the tech industry as a whole, in aiming at the top of the market Intel will once again be developing discrete GPUs. The company has tried this route twice before: once in the early days with the i740 in the late 90s, and again with the aborted Larrabee project in the late 2000s. Yet even though these efforts never panned out quite like Intel had hoped, the company has continued to develop their GPU architecture and GPU-like devices, the latter embodied by the massively parallel, compute-focused Xeon Phi family.

Yet while Intel has GPU-like products for certain markets, the company doesn't have a proper GPU solution once you get beyond their existing GT4-class iGPUs, which are, roughly speaking, on par with $150-or-so discrete GPUs. In other words, Intel has no access to the midrange market or above with their iGPUs. With the hiring of Raja and Intel's new direction, the company is going to be expanding into full discrete GPUs for what the company calls "a broad range of computing segments."

Intel Discrete GPU Planned to be Released in 2020 8 comments

Intel's First (Modern) Discrete GPU Set For 2020

In a very short tweet posted to their Twitter feed yesterday, Intel revealed/confirmed the launch date for their first discrete GPU developed under the company's new dGPU initiative. The otherwise unnamed high-end GPU will be launching in 2020, a short two to two-and-a-half years from now.

[...] This new GPU would be the first GPU to come out of Intel's revitalized GPU efforts, which kicked into high gear at the end of 2017 with the hiring of former AMD and Apple GPU boss Raja Koduri. Intel of course is in the midst of watching sometimes-ally and sometimes-rival NVIDIA grow at a nearly absurd pace thanks to the machine learning boom, so Intel's third shot at dGPUs is ultimately an effort to establish themselves in a market for accelerators that is no longer niche but is increasingly splitting off customers who previously would have relied entirely on Intel CPUs.

[...] Intel isn't saying anything else about the GPU at this time, though we do know from Intel's statements when they hired Koduri that they're starting with high-end GPUs, a fitting choice given the accelerator market Intel is going after. This GPU is almost certainly aimed at compute users first and foremost, especially if Intel adopts the bleeding-edge strategy that AMD and NVIDIA have started to favor, but Intel's dGPU efforts are not entirely focused on professionals. Intel has also confirmed that they want to go after the gaming market as well, though what that would entail, and when, is another question entirely.

Previously: AMD's Radeon Technologies Group Boss Raja Koduri Leaves, Confirmed to be Defecting to Intel
Intel Planning a Return to the Discrete GPU Market, Nvidia CEO Responds


Original Submission

This discussion has been archived. No new comments can be posted.
  • (Score: 0, Touché) by Anonymous Coward on Thursday November 09 2017, @12:37AM (#594321)

    I have a bunch of trivia to care about first, but I could do this ... probably after Trump's last POTUS tweet. How's 2020 looking for you?

    Ooh, it's just possible it might be 2024 ... well, either way. Touch base with me then.

  • (Score: 1, Funny) by Anonymous Coward on Thursday November 09 2017, @12:58AM (#594325)

    ...Zero Defects.

  • (Score: 2) by linkdude64 (5482) on Thursday November 09 2017, @04:39AM (#594428)

    Which I have been reading about recently:
    - Differences between the temperatures Vega GPUs self-report and the temperatures physically measured;
    - GPU package sizes that physically vary depending on the model, making non-reference mass production of cards very difficult (Gigabyte, for instance, said a month or two ago that it will not be producing a non-reference card, which is huge);
    - A now two-month delay for even the fully committed non-reference card manufacturers (ASUS, EVGA, etc.) to release their cards;
    - Powercolor reporting that it is ready, but is "waiting for its shipments of HBM modules" (?!?!?!).
    Something is clearly going very wrong with Vega.

    This is a surprise to me, as I had imagined the glut of income they should have had due to the mining boom would've given them every resource they needed to produce a worthy successor line. I will still refuse to support NVidia, but I had been waiting to buy a Vega card, and loathe the thought of paying full price for a card like a 580 which is just a rebranded architecture that is several years old already. Damn shame.

    • (Score: 1) by Booga1 (6333) on Thursday November 09 2017, @07:18PM (#594777)

      Got a Vega 56 right at launch. I'm quite pleased with it since it's nowhere near as loud as the reviews seemed to be making out, and it's about 4x as powerful as my previous card.
      Can't say I'd want to pay the 85-100% premium for 50% more performance from a 1080 Ti.
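
      Using only the ratios quoted above (roughly 1.5x the performance at an 85-100% price premium; no independent benchmark figures assumed), the implied performance per dollar works out as follows:

          # Performance per dollar implied by the figures in the comment:
          # ~1.5x the performance at an 85-100% price premium.
          perf_ratio = 1.50
          for premium in (0.85, 1.00):
              price_ratio = 1.0 + premium
              print(f"{premium:.0%} premium -> "
                    f"{perf_ratio / price_ratio:.2f}x performance per dollar")
          # 85% premium  -> 0.81x; 100% premium -> 0.75x, i.e. the pricier
          # card delivers 19-25% less performance per dollar at those prices.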

      • (Score: 2) by linkdude64 (5482) on Friday November 10 2017, @04:44PM (#595180)

        Well, that is good to know. I am very picky about hardware, however, and for GPUs will generally only buy ASUS custom cards due to their astounding build quality. Any blower card is going to be much louder than a decently made custom card, at any rate. I just know that Vega is quite power-hungry, especially compared to the Nvidia cards, and so heat dissipation becomes very important. Again, I would rather not buy any cards than buy Nvidia.

  • (Score: 2) by TheRaven (270) on Thursday November 09 2017, @09:52AM (#594549) Journal
    Delays in posting the story mean that it comes after the announcement that he's definitely gone to Intel.
    --
    sudo mod me up