Arthur T Knackerbracket has processed the following story:
The data compiled by Jon Peddie Research (JPR) reveals a significant surge in global AIB [add-in board] volumes, up 47.9 percent year-on-year to 9.5 million units and up 9.4 percent quarter-over-quarter from 8.7 million.
Yet since Intel introduced its first dedicated AIB – or graphics card – via the Arc Alchemist microarchitecture in March 2022, the company has seemingly failed to capture meaningful market share from either Nvidia or AMD, at least according to JPR.
[...] When Intel first teased its Arc GPUs, there was a lot of buzz. Could Chipzilla translate its experience in processors to AIBs and perform as well in the dedicated graphics market as it has elsewhere?
On launch, the company talked a big game about disrupting the duopoly of Nvidia and AMD. Intel promised its products would be affordable and competitive, with options for gamers, creators, and enterprise users. Just over two years in, the reality hasn't lived up to the hype. Intel has suffered some technical setbacks, including driver instability and immaturity, which is a given for a new player in the market. The other stumbling block is performance-related, although Intel has consistently released driver updates aimed at addressing it.
From here, Intel's foray into the AIB market looks like a dud, particularly considering the company's poor financial position and reports of rivals expressing interest in acquiring its assets. If Intel can't even take a full percentage point of AMD's market share, it seemingly doesn't stand a chance.
Unless Intel can recapture some of that earlier buzz with the upcoming Battlemage AIBs between now and the end of 2025, its goal of becoming a major player in dedicated graphics looks like a pipe dream.
Intel needs to focus on its pedigree in microprocessors rather than trying to enter a market locked down by Nvidia, especially since the issues around its 13th and 14th gen Core series families haven't done its reputation any favors. Nvidia's dominance in the broader graphics market looks unlikely to change as we enter the age of AI, and neither does its chokehold on the AIB industry, at least not any time soon.
(Score: 2) by driverless on Thursday October 10, @07:17AM (3 children)
Yet another story that needs the "Surprising exactly nobody" thumbnail. Intel has been failing at graphics since the 82786 nearly 40 years ago. The only reason anyone ever used Intel graphics was because it was baked into their system and they had no other choice.
(Score: 2) by janrinok on Thursday October 10, @08:34AM (2 children)
Nevertheless, Intel repeatedly believe that 'this time' they have something that will have an impact on the market. Whether they are being overly optimistic, or the company's technical abilities do not match their ambitions, or it is just too little too late, I do not know. They are presumably not trying to fail, so perhaps one day they will have a success that surprises us and maybe even surprises them.
It doesn't look like this is it, but they might still pull something out of the bag one day.
(Score: 3, Insightful) by driverless on Thursday October 10, @08:44AM (1 child)
Problem is that their competitors nVidia and AMD/ATI have a 25-year lead on how to do it. They've been evolving and tuning their tech for about a quarter of a century, while Intel has to start each time with a new, high-risk, unproven alternative because they can't just play catch-up with nVidia and AMD's existing tech. And it's not just the hardware: the two big players have also spent that time evolving DirectX/OpenGL/Vulkan/whatever in parallel with the hardware, so to be compatible with that without getting into an nVidia/AMD catch-up, Intel has to jump through all sorts of hoops or nobble whatever magic hardware they've dreamed up to work with the existing software way of doing things.
(Score: 2) by aafcac on Thursday October 10, @07:42PM
If they stuck with it or focused on specific niche uses like laptops, they might eventually make progress, but they don't have the option of buying a GPU company and they don't have the patience. Even just throwing a bunch of that effort into processor extensions might help.