Possible outcomes run the gamut from "more delays" to outright cancellation:
Almost a year ago, Intel made a big announcement about its push into the dedicated graphics business. Intel Arc would be the brand name for a new batch of gaming GPUs, pushing far beyond the company's previous efforts and competing directly with Nvidia's GeForce and AMD's Radeon GPUs.
Arc is the culmination of years of work, going back to at least 2017, when Intel poached AMD GPU architect Raja Koduri to run its own graphics division. And while Intel would be trying to break into an established and fiercely competitive market, it would benefit from the experience and gigantic install base that the company had cultivated with its integrated GPUs.
[...] The first Arc GPUs were initially targeted for early 2022, and Intel managed to announce a pair of low-end 300-series laptop GPUs at the tail end of March. To date, relatively few of those laptops are actually available for purchase, and no one in the US has been able to buy anything else. A desktop version of the 1080p-focused Arc A380 has appeared in China, though, and a few publications have managed to import and test it.
[...] Arc's performance is also worst when playing older games that don't support the DirectX12 or Vulkan APIs, pointing to one huge issue that Intel has openly acknowledged: The company is struggling with its GPU drivers.
[...] To its credit, Intel has openly acknowledged the problem that Arc has with pre-DirectX12 games, not just in slickly produced PR videos but by allowing its marketing team to conduct a charm offensive on popular tech YouTube channels like Linus Tech Tips and Gamers Nexus.
In LTT's case, this means jokes, fast edits, and sly winks to make viewers feel like they're getting secret, under-the-table information, even though Intel PR is standing over the channel's shoulder. The videos give Intel a way to own and partially defuse criticism of Arc's performance in older games. It's also a way to suggest that the company has nothing to hide.
As a PR strategy, it's great. It's the Domino's Pizza gambit: when your product's issues are impossible to ignore, you can build more trust and buy yourself a little goodwill and time by issuing loud, public mea culpas and owning the problem rather than ignoring it. And the tech-tubers have seemed receptive to Intel's framing—sure, performance in older games is all over the place, but it just means we're going to get great performance for the price in newer games.
[...] The next couple of years will be crucial for Intel's GPUs. Better drivers, aggressive pricing, and the Battlemage architecture could all help Intel find a foothold, establishing a third competitor in consumer and workstation GPUs and making the segment more competitive. Or Arc could end up going the way of Larrabee, a once-promising project that just didn't work out the way it was supposed to.
(Score: 3, Interesting) by richtopia on Thursday August 11 2022, @04:03PM (2 children)
The reviews I've been seeing are all over the place. Nothing is promoting these GPUs as clear winners, but I've seen reviews ranging from "hot trash" to "competitively priced". My impression is they are reasonable if you use Resizable BAR (which I believe requires an Intel CPU) and a game/application with robust driver support. The driver problems can be fixed with further software effort.
I'm cautiously optimistic. I'm excited for an AV1 encoder; if Twitch or YouTube adds support for livestreaming with that codec, it could really build a use case for these GPUs, or perhaps for the integrated GPUs on Intel CPUs in the future. I'm also curious how these GPUs will fare in computation applications.
(Score: 3, Interesting) by takyon on Thursday August 11 2022, @06:46PM (1 child)
Best case scenario, they age like FineWine™ [wccftech.com] with driver updates. The A380 in particular has 6 GB of VRAM rather than 4, and is potentially better than the 6500 XT.
A better way of saying it is that if you don't use Resizable BAR, you are fucked. Minimum framerates (1% lows) in particular take a big hit. [techspot.com] (ref [techspot.com])
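If you want to check whether Resizable BAR is actually in effect on your own box, here's a minimal Python sketch that scrapes the `BAR n: current size: ...` lines from `lspci -vv` output. The sample text below is hypothetical (a device with ReBAR sized up to its full 8 GB), but the line format follows lspci's Resizable BAR capability dump; if the current size matches the largest supported size, the BAR covers all of VRAM:

```python
import re

def rebar_status(lspci_vv_output: str):
    """Parse `lspci -vv` text for Resizable BAR info.

    Returns a list of (bar_index, current_size, max_supported) tuples,
    or an empty list if no Resizable BAR capability lines are present.
    """
    results = []
    for match in re.finditer(
        r"BAR (\d+): current size: (\S+), supported: (.+)",
        lspci_vv_output,
    ):
        bar = int(match.group(1))
        current = match.group(2)
        supported = match.group(3).split()
        # lspci lists supported sizes smallest to largest
        results.append((bar, current, supported[-1]))
    return results

# Hypothetical capture of `lspci -vv` for a GPU with ReBAR enabled.
sample = """\
Capabilities: [200 v1] Physical Resizable BAR
    BAR 0: current size: 8GB, supported: 256MB 512MB 1GB 2GB 4GB 8GB
"""
print(rebar_status(sample))  # [(0, '8GB', '8GB')] -> BAR spans full VRAM
```

If the capability block is missing entirely, or the current size is stuck at 256MB, ReBAR is off and you can expect exactly the 1% lows problem described above.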
There are rumors of a hardware flaw that can only be fixed with a refresh or new launch. Also that Intel is reviewing the discrete graphics division to determine whether or not to kill it. It may be a money pit. We just saw Intel kill Optane.
The competition should get AV1 hardware encode soon. AMD will have it with RDNA 3 GPUs, and apparently AV1 hardware encode will be included in the Ryzen 7000 desktop CPUs [notebookcheck.net] coming out in a couple of months:
I don't know if you can call it backporting, because it was probably always possible to separate the Video Core Next version from the graphics version. But it is a nice surprise.
[SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
(Score: 2) by takyon on Thursday August 11 2022, @06:51PM
Just kidding, it's AV1 decode only:
https://www.techpowerup.com/review/amd-zen-4-ryzen-7000-technical-details/ [techpowerup.com]
I haven't heard anything about Nvidia and AV1 encode yet.
(Score: 2) by drussell on Thursday August 11 2022, @04:13PM (1 child)
This is not the first time Intel has tried to play this game.
Last time, it did not go well.
(Score: 2) by driverless on Friday August 12 2022, @10:50AM
Every single time they've tried this it did not go well.
Every. Single. Time.
This is another one of those headlines that needs a "Surprising exactly no-one..." at the start.
(Score: 2) by Revek on Thursday August 11 2022, @04:44PM (1 child)
When these were announced I was doubtful. Nothing so far has given me any reason to believe Intel might actually produce a decent GPU.
This page was generated by a Swarm of Roaming Elephants
(Score: 3, Funny) by DannyB on Thursday August 11 2022, @06:43PM
I am reminded of a certain Intel processor that was nicknamed the iTanic.
How often should I have my memory checked? I used to know but...
(Score: 2) by RamiK on Thursday August 11 2022, @07:15PM (5 children)
Consoles... Smartphones... Tablets... Streamers... SBCs... Software-defined networking... And now GPUs. You have to wonder when Intel will stop embodying the Peter principle and realize that if they keep playing it safe by entering markets only after they have stabilized, they'll keep losing on price point, since commodity markets are best served by Asian fabs.
I mean, it's really amazing how their corporate structure can be so talent-obsessed on the one hand and so consistently risk-averse on the other. Like, even Microsoft somehow ended up doing the Xbox due to some odd fluke... But Intel? Always launching too little, too late, while repeatedly shrugging it off as "unexpected" market changes.
Gold to lead every time. Remarkable.
compiling...
(Score: 4, Informative) by takyon on Thursday August 11 2022, @07:34PM (4 children)
Intel's Alchemist GPUs were fabbed by TSMC, not Intel. There was nothing stopping them from offering a competitive alternative to Nvidia and AMD. There are Chinese GPUs emerging [videocardz.com], but they are irrelevant for now.
Intel has problems on the inside, internal miscommunications and rivalries for example. These are leading to delays and other problems. The delays for Alchemist caused it to miss a window of opportunity during which demand was so high that they could have launched GPUs with drivers just as bad as they are now, if not worse, and sold everything they made. Instead they are competing with much better options that have slipped below MSRP, with next-gen around the corner.
It's hard to excuse them for bad drivers, since they have shipped hundreds of millions of iGPUs and Intel Xe DG1 [tomshardware.com], which was basically a beta run for Intel discrete graphics. Like Lakefield for Alder Lake.
(Score: 2) by RamiK on Thursday August 11 2022, @08:58PM (2 children)
I'm aware. It doesn't matter. The point is that they're still late to enter a soon-to-be-made-in-China commodity market.
It's an in-house issue. It's about the timing and delays.
These "internal rivalries and miscommunications" are a chronic symptom of their risk aversion that goes back at least to the Itanium's failure. They somehow managed to launch Core in time before AMD ate their lunch, mostly because AMD screwed up... But then the whole Transmeta and Atom derailments came along, and the pattern has just kept going since...
Face it, this is Intel and it's been Intel for some 20 years or so.
(Score: 2) by takyon on Thursday August 11 2022, @09:44PM (1 child)
Intel still has too big to fail status with its ability to fab its own products, competitors' products, and spam CPUs into OEM desktops and laptops that now tend to be cheaper than AMD. We could see a cursed scenario in which AMD will be locked out of TSMC's best nodes following an invasion by China, and U.S. government funding will be poured into Intel, far beyond what they just got from the CHIPS Act.
Atom these days is pretty good, and I think the upcoming 8-core Alder Lake-N [tomshardware.com] Atom chips could be a game changer. By Atom derailment, I assume you mean the smartphone SoCs in particular. Which were made as late as October 2016, wow. Also, RIP Intel Quark [wikipedia.org].
(Score: 2) by RamiK on Friday August 12 2022, @09:35AM
Yeah, we're on the same page. Specifically, the circa-2013 Silvermont 3rd-gen chips were in particularly bad shape and are still receiving occasional platform patches to this day: https://lore.kernel.org/lkml/063d7fdb-2d4c-5798-773b-d82b4f0e918a@redhat.com/ [kernel.org]
But they never really got stable until around the 2017 Goldmont Plus, and that was entirely coincidental, since Intel got stuck on 14nm three times longer than expected (almost 6 years instead of the usual 2-3 per node), so the microarch had a chance to play some catch-up thanks to Google's Chromebooks.
Tremont (and Gracemont) are something else entirely. Like, on the surface they look like Atoms due to the power budget and cache sizes... But if you look at the fetch and decode pipelines, it's pretty clear the microarch was designed for low-power software-defined networking and laptops. Technically they sell... But practically speaking, between their price point and how they're not scaling into the markets where ARM operates with similar archs... Well, it's really more of the same.
Offering 8 cores at 10 W TDP is just barely countering Apple's M1 and Qualcomm's 800-1000 series to retain their position in the "ultramobile" Microsoft Surface, Chromebook, and micro-PC SoC segments. To be a "game changer", they'd need to actually expand into newer markets so their fabs can scale to different yields. Like, use that ARM license of theirs and compete against TSMC and MediaTek at the 4-6 W TDP where the MediaTek Dimensity 9000+ operates. Or at least work to get Android rock solid on x86 so they can use their existing 6-10 W TDP mobile SoCs to compete against Qualcomm... But they're doing neither.
Atoms then and GPUs now, Intel entered the game too little, too late. As for the future, their idea of investing in RISC-V is to put money into HPC, so I'm pretty sure I know what the next "unexpected" is going to be...
(Score: 2) by driverless on Friday August 12 2022, @11:48AM
That's by InnoSilicon, they make mining rigs, not graphics cards. The "GPU" is for mining, not rendering images, even if they're peddling them as graphics cards.
(Score: 2) by Freeman on Friday August 12 2022, @03:15PM
I've watched some Linus Tech Tips videos, and they did do a "WAN Show" where they had people from Intel live. The show wasn't a rubber stamp of approval from Linus. In subsequent "WAN Shows" they commented on the rumors, etc. They are taking the wait-and-see approach, which seems reasonable. Without the hardware in your hands, you're just speculating about its capabilities. Still, I'd definitely wait for a review of released hardware before I thought about buying an Intel GPU. Price is a very big consideration as well. If it's priced similarly to AMD/Nvidia but has a bunch of problems, that would be a big problem. If it's priced significantly lower and has some downsides but could get better with driver updates (or even if it can't, due to hardware issues), it could still be a step in the right direction, i.e., Intel being a third player in the GPU wars.
Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"