posted by hubie on Wednesday October 09 2024, @02:55PM

Arthur T Knackerbracket has processed the following story:

The data compiled by Jon Peddie Research (JPR) reveals a significant surge in global AIB [add-in board] volumes, up 47.9 percent year-on-year to 9.5 million units and up 9.4 percent quarter-over-quarter from 8.7 million.

Yet since Intel introduced its first dedicated AIB – or graphics card – via the Arc Alchemist microarchitecture in March 2022, the company has seemingly failed to capture meaningful market share from either Nvidia or AMD, at least according to JPR.

[...] When Intel first teased its Arc GPUs, there was a lot of buzz. Could Chipzilla translate its experience in processors to AIBs and perform as well in the dedicated graphics market as it has elsewhere?

On launch, the company talked a big game about disrupting the duopoly of Nvidia and AMD. Intel promised its products would be affordable and competitive, with options for gamers, creators, and enterprise users. Just over two years in, the reality hasn't lived up to the hype. Intel has suffered some technical setbacks, including driver instability and immaturity, which is a given for a new player in the market. The other stumbling block is performance-related, although Intel has consistently released new driver updates to address it.

From here, Intel's move into the AIB market looks like a dud, particularly considering the company's poor financial position and rivals expressing interest in acquiring its assets. If Intel can't even dent AMD's market share by a full percentage point, it seemingly doesn't stand a chance.

Unless Intel can recapture some of that earlier buzz with the upcoming Battlemage AIBs between now and the end of 2025, its goal of becoming a major player in dedicated graphics looks more like a pipe dream.

Intel needs to focus on its pedigree in microprocessors rather than trying to break into a market locked down by Nvidia, especially since the instability issues around its 13th- and 14th-gen Core series families haven't done its reputation any favors. Nvidia's dominance in the broader graphics market looks unlikely to change as we enter the age of AI, and neither does its chokehold on the AIB industry, at least not any time soon.


Original Submission

Related Stories

Intel Arc B580 Review: A $249 RTX 4060 Killer, One-and-a-Half Years Later

https://arstechnica.com/gadgets/2024/12/review-intel-arc-b580-is-a-compelling-if-incredibly-tardy-250-midrange-gpu/

After much anticipation, many delays, and an anticipatory apology tour for its software quality, Intel launched its first Arc GPUs at the end of 2022. There were things to like about the A770 and A750, but buggy drivers, poor performance in older games, and relatively high power use made them difficult to recommend. They were more notable as curiosities than as consumer graphics cards.
[...]
The new Arc B580 card, the first dedicated GPU based on the new "Battlemage" architecture, launches into the exact same "sub-$300 value-for-money" graphics card segment that the A770 and A750 are already stuck in. But it's a major improvement over those cards in just about every way, and Intel has gone a long way toward fixing drivers and other issues that plagued the first Arc cards at launch. If nothing else, the B580 suggests that Intel has some staying power and that the B700-series GPUs could be genuinely exciting if Intel can get one out relatively soon.
[...]
As with the Arc A-series cards, Intel emphatically recommends that resizable BAR be enabled for your motherboard to get optimal performance. This is sometimes called Smart Access Memory or SAM, depending on your board; most AMD AM4 and 8th-gen Intel Core systems should support it after a BIOS update, and newer PCs should mostly have it on by default. Our test system had it enabled for the B580 and for all the other GPUs we tested.
[...]
Intel is explicitly targeting Nvidia's GeForce RTX 4060 with the Arc B580, a role it fills well for a low starting price. But the B580 is perhaps more damaging to AMD, which positions both of its 7600-series cards (and the remaining 6600-series stuff that's hanging around) in the same cheaper-than-Nvidia-with-caveats niche.
[...]
All of that said, Intel is putting out a great competitor to the RTX 4060 and RX 7600 a year and a half after those cards both launched—and within just a few months of a possible RTX 5060. Intel is selling mid-2023's midrange GPU performance in late 2024. There are actually good arguments for building a budget gaming PC right this minute, before potential Trump-administration tariffs can affect prices or supply chains, but assuming the tech industry can maintain its normal patterns, it would be smartest to wait and see what Nvidia does next.
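
The Resizable BAR advice in the excerpt above is easy to check for yourself. On Linux, the kernel exposes each PCI device's BAR regions through sysfs, so a few lines of Python can report the aperture sizes: if the largest memory BAR is only 256 MiB rather than something close to the card's full VRAM, Resizable BAR is most likely disabled or unsupported. The sketch below is illustrative and not from either article; the PCI address is a placeholder you would replace with the one lspci reports for your own card.

    from pathlib import Path

    # Placeholder PCI address of a discrete GPU; substitute the one lspci reports.
    DEVICE = "0000:03:00.0"

    def bar_sizes(device):
        """Return the size in bytes of each BAR the device exposes via sysfs."""
        sizes = []
        resource = Path(f"/sys/bus/pci/devices/{device}/resource")
        for line in resource.read_text().splitlines():
            # Each line holds "start end flags" as hex fields.
            start, end, _flags = (int(field, 16) for field in line.split())
            if end > start:  # unused BARs read back as all zeros
                sizes.append(end - start + 1)
        return sizes

    for size in sorted(bar_sizes(DEVICE), reverse=True):
        print(f"{size / 2**20:10.0f} MiB")

On a B580 with Resizable BAR active you would expect one BAR near the card's full 12 GiB of VRAM; a 256 MiB cap suggests the BIOS-update advice from the review applies to your system.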

Related articles on SoylentNews:
Intel Entrance To Graphics Card Market Has Failed - 20241008
Intel's GPU Drivers Now Collect Telemetry, Including 'How You Use Your Computer' - 20230818
Getting AAA Games Working in Linux Sometimes Requires Concealing Your GPU - 20230811
Rumors, Delays, and Early Testing Suggest Intel's Arc GPUs are on Shaky Ground - 20220810
Intel Arc GPUs Could Give Gamers a Reason to Drop Windows 11 for Linux - 20220202
Intel Plans to Launch High-End "Arc" GPUs in Q1 2022 - 20210817


Original Submission

This discussion was created by hubie (1068) for logged-in users only, but now has been archived. No new comments can be posted.
  • (Score: 3, Informative) by aafcac on Wednesday October 09 2024, @03:00PM (4 children)

    by aafcac (17646) on Wednesday October 09 2024, @03:00PM (#1376347)

    Is anybody even the slightest bit surprised here? I think the last time Intel made a graphics card that people actually liked was back when 2D accelerators were the norm. I personally had one of those horrible i740s for a bit back in the '90s, and I'm not sure I'll ever live long enough to pay money for an Intel GPU as a result. The thing bordered on fraud, being marketed as having 3D acceleration capabilities.

    • (Score: 2) by driverless on Thursday October 10 2024, @07:17AM (3 children)

      by driverless (4770) on Thursday October 10 2024, @07:17AM (#1376417)

      Yet another story that needs the "Surprising exactly nobody" thumbnail. Intel has been failing at graphics since the 82786 nearly 40 years ago. The only reason anyone ever used Intel graphics was because it was baked into their system and they had no other choice.

      • (Score: 2) by janrinok on Thursday October 10 2024, @08:34AM (2 children)

        by janrinok (52) Subscriber Badge on Thursday October 10 2024, @08:34AM (#1376423) Journal

        Nevertheless, Intel repeatedly believe that 'this time' they have something that will have an impact on the market. Whether they are being overly optimistic, or the company's technical abilities do not match their dreams, or it is just too little too late, I do not know. They are presumably not trying to fail, so perhaps one day they will have a success that surprises us and maybe even surprises them.

        It doesn't look like this is it, but they might still pull something out of the bag one day.

        --
        [nostyle RIP 06 May 2025]
        • (Score: 3, Insightful) by driverless on Thursday October 10 2024, @08:44AM (1 child)

          by driverless (4770) on Thursday October 10 2024, @08:44AM (#1376425)

          Problem is that their competitors nVidia and AMD/ATI have a 25-year lead on how to do it. They've been evolving and tuning their tech for about a quarter of a century, while Intel has to start each time with a new, high-risk, unproven alternative because they can't just play catch-up with nVidia and AMD's existing tech. And it's not just the hardware: the two big players have also spent that time evolving DirectX/OpenGL/Vulkan/whatever in parallel with the hardware, so to be compatible with that without getting into an nVidia/AMD catch-up, Intel has to jump through all sorts of hoops or nobble whatever magic hardware they've dreamed up to work with the existing software way of doing things.

          • (Score: 2) by aafcac on Thursday October 10 2024, @07:42PM

            by aafcac (17646) on Thursday October 10 2024, @07:42PM (#1376482)

            If they stuck with it or focused on a specific niche, like laptops, they might eventually make progress, but they don't have the option of buying a GPU company and they don't have the patience. Even just throwing a bunch of that work into processor extensions might help.

  • (Score: 5, Insightful) by DadaDoofy on Wednesday October 09 2024, @03:42PM (1 child)

    by DadaDoofy (23827) on Wednesday October 09 2024, @03:42PM (#1376350)

    Intel is an old, top-heavy company like Boeing. It survives only because of the brand's past glory. Both companies are run by bean counters focused on the quarterly results Wall Street demands, rather than any sort of excellence in their products. They are on borrowed time.

    • (Score: 4, Insightful) by RamiK on Wednesday October 09 2024, @06:47PM

      by RamiK (1813) on Wednesday October 09 2024, @06:47PM (#1376364)

      It survives only because of the brand's past glory

      Intel survives because it's heavily subsidized.

      rather than any sort excellence in their products. They are on borrowed time.

      They have a lot of good engineers and generational industry knowledge that simply can't be found anywhere else in the US. However, now that Apple manufactures some of its chips in the US, I'm guessing it's probably viable to at least gradually split Intel apart along fab/fabless lines.

      --
      compiling...
  • (Score: 2) by looorg on Wednesday October 09 2024, @05:36PM

    by looorg (578) on Wednesday October 09 2024, @05:36PM (#1376359)

    That would explain why they are trying to sell them at large discounts, a hundred-plus dollars below what I assume are somewhat similar products from Nvidia. I have been almost tempted to get one.

  • (Score: 3, Interesting) by bzipitidoo on Wednesday October 09 2024, @05:51PM (2 children)

    by bzipitidoo (4388) on Wednesday October 09 2024, @05:51PM (#1376360) Journal

    Firstly, Intel claims that the instability problems of their 13th and 14th gen CPUs are fixed [tomshardware.com]. I hope Intel is correct about this. And honest. They haven't always been honest: witness the very poor handling of the FDIV bug in the first Pentiums in the early 1990s. There used to be a website, faceintel.com, that was very critical of Intel's treatment of employees.

    Intel survived all that quite handily. AMD beat Intel to the punch with a design for a 64-bit x86 architecture, and Intel weathered that too. Just a few years after AMD64, it was AMD that was falling behind while Intel tick-tocked further and further ahead. AMD finally released Ryzen, catching up.

    But now, to hear that Intel is struggling? What? I find this hard to believe. Yes, their early integrated graphics, such as the i845 stuff, were horrifically slow. They've gotten a lot better. I can't understand why Intel is so bad at graphics.

    • (Score: 3, Interesting) by aafcac on Wednesday October 09 2024, @09:17PM

      by aafcac (17646) on Wednesday October 09 2024, @09:17PM (#1376384)

      I'm not sure that "weathering" is how I'd describe engaging in blatant antitrust violations so egregious that regulators actually stepped in. And then doing it again a few years later when AMD had caught up again.

      Intel exists mainly because they can pump out the chips fast enough to take up most of the demand. If they actually had to compete on merits without being able to commit antitrust violations, I'm not sure they'd still be in business. And with more and more computing being done on mobile devices that can't deal with any of Intel's offerings, I do wonder if they're going to still be in business in 20 years.

    • (Score: 2) by Reziac on Thursday October 10 2024, @04:21AM

      by Reziac (2489) on Thursday October 10 2024, @04:21AM (#1376408) Homepage

      Over in yonder junk box I have one of those early AMD64 CPUs. It cannot run a 64-bit OS, only 32-bit x86. AMD had a similar problem back in the olden days: a 32-bit CPU that would only support a 16-bit bus.

      Just in case you wondered why a 32-bit version of Vista shipped on HP's concurrent foray into AMD64 CPUs...

      --
      And there is no Alkibiades to come back and save us from ourselves.