Arthur T Knackerbracket has processed the following story:
Nvidia has been riding high thanks to AI, the current center of attention in the tech industry. The chipmaker's silicon is among the few kinds of hardware that can provide the processing power needed to run resource-intensive commercial AI models.
Because of this, Nvidia has become something of a bellwether for the AI industry, its soaring stock price read as an indication of AI's extreme rate of growth.
Over the past week, however, Nvidia has taken a huge tumble on the stock market, and has lost around $500 billion in value.
[...] Nvidia is still doing just fine, of course. It's likely still raking in plenty of revenue thanks to AI-related patronage from the likes of Elon Musk, who is reportedly building an Nvidia-based "supercomputer" via his AI company xAI. However, this recent downturn on the stock market might be investors' way of signaling that they're not so bullish on the AI industry's monumental claims about how its technology will change the world.
As multiple outlets have reported, AI companies have made big promises, but so far have had very little to show for it in terms of actual meaningful change in the industries AI claimed it would soon disrupt. On top of that, studies have found AI to be a massive energy and resource drain, which will certainly give at least some AI backers second thoughts about where the industry is headed.
(Score: 2) by ikanreed on Sunday June 30 2024, @04:09AM (3 children)
I expected the bubble to go another year.
(Score: 2) by ls671 on Sunday June 30 2024, @05:33AM (2 children)
A rising stock always overshoots, and then a correction occurs. The correction might indeed have come a little sooner thanks to all the recent reports of stupid things AI spewed, especially this week about the presidential debate.
Everything I write is lies, including this sentence.
(Score: 3, Insightful) by driverless on Sunday June 30 2024, @11:07AM (1 child)
Which is why the title is really saying "nVidia loses fantasy number pissing match". As it should for any company whose fantasy number is highest this week or month.
(Score: 0) by Anonymous Coward on Sunday June 30 2024, @06:23PM
Naah. I bought a bunch of shares around $90/share, and bought a bunch more around $300/share. When it peaked around $140/share (post-split) I sold a bunch. The feds will rape me for their "fair share" of the reward for taking a risk, but now I've recouped the money I initially invested. The rest is gravy.
Now I'm sitting on hundreds of shares and ~$30k in pretend money, and that's after nVidia lost a shit ton of value...it's still trading above the split price.
If your company is valued at $1,000, how much do you lose by dropping 50%? $500. Not a big number. How much do you lose if your company is valued at $1,000,000,000,000 and you lose 50%? More than the GDP of some nations... but you're still filthy rich.
(Score: 3, Insightful) by DrkShadow on Sunday June 30 2024, @04:32AM (2 children)
That's the sum of how much stock the executives sold recently?
Not really surprising.
Alternatively, you could say that it's the same group of people propping all of these stocks up, and when a large-ish entity decides to sell a lot, there aren't that many suckers investors to absorb the hit. So the price goes (way) down, disproportionately down, way quicker than if it had a broader supporting base. I guess that says how many people buy into AI as a good bet, and how much of a "fad" it is.

Now hopefully AI hasn't been baked into CPU instruction sets so hard that we can't about-face on those instructions. sigh.
(Score: 3, Interesting) by aafcac on Sunday June 30 2024, @03:12PM (1 child)
Between that and companies being allowed to buy back their own stock, it's well past time that we stopped using market cap for much of anything. If we want to rank the largest companies, it ought to be based on things that are more relevant to the world, like their assets, sales, and profitability. It's one of the reasons the Dow Jones Industrial Average is such a stupid index.
(Score: 1) by khallow on Wednesday July 03 2024, @02:52AM
Do you use that market cap statistic for anything? I suspect this is an already solved problem.
(Score: 2) by RamiK on Sunday June 30 2024, @08:35AM (4 children)
Scalable MatMul-free Language Modeling [arxiv.org]
MatMul-Free LM [github.com]
compiling...
(Score: 2) by Rich on Sunday June 30 2024, @12:54PM (3 children)
ARM Macbooks can run local inference (e.g. Stable Diffusion) rather well. However, the gap between CPU and GPU processing for that seems to be rather small, and memory-bandwidth limited, which is something the TFA's paper abstract also takes into consideration. I lack the full understanding of what exactly goes on all the way down to the metal, but given that the inputs need to be applied against all the parameters, which reside in memory, I can well imagine that memory speed is more important than piling on another magnitude of computing units (unless these units individually get all the local storage they need to finish their tasks).
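To make that intuition concrete, here's a back-of-the-envelope sketch of why inference can be bandwidth-bound rather than compute-bound. Every figure below is an illustrative assumption (a hypothetical 7B-parameter model at fp16 on a machine with ~400 GB/s of memory bandwidth), not a measurement:

```python
# Rough upper bound on bandwidth-limited inference speed.
# All numbers are illustrative assumptions, not benchmarks.

PARAMS = 7e9            # assumed model size: 7B parameters
BYTES_PER_PARAM = 2     # assumed fp16/bf16 weights
BANDWIDTH = 400e9       # assumed ~400 GB/s memory bandwidth

# If each generation step has to stream every weight out of memory,
# then bandwidth, not raw FLOPs, caps the step rate:
bytes_per_step = PARAMS * BYTES_PER_PARAM
steps_per_second = BANDWIDTH / bytes_per_step
print(f"~{steps_per_second:.1f} steps/s upper bound")  # ~28.6
```

Under those assumptions, piling on more compute units wouldn't help at all; only faster memory (or per-unit local storage, as suggested above) would.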
(Score: 2) by RamiK on Sunday June 30 2024, @01:36PM (2 children)
Macbooks use NPUs for inference: https://github.com/hollance/neural-engine/blob/master/docs/supported-devices.md [github.com]
Nvidia's GPUs and MatMul-Free LM are for training.
compiling...
(Score: 3, Informative) by Rich on Sunday June 30 2024, @09:57PM (1 child)
The stuff that we get to tinker with (i.e. PyTorch) doesn't use the NPU so far. You can select either "CPU" or "MPS" (for "Metal Performance Shaders"), which means the GPU. It's entirely possible to train on that setup at a small scale, though some commenter said that the NPU's precision is not suitable for training (at least as we know it). I've seen a benchmark where the NPU outperformed the CPU or GPU by a factor of over 10, possibly done on top of CoreML (which is Apple's proprietary foundation), but somehow I can't imagine that large-model inference would be sped up by that factor. If it were, putting Stable Diffusion on CoreML (cf. https://machinelearning.apple.com/research/stable-diffusion-coreml-apple-silicon [apple.com]) would outrun the RTX4090. As I said, I have no idea which bits of the full stack go where at what time, so I can't draw sensible conclusions.
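For anyone who wants to poke at it, a minimal sketch of the CPU/MPS selection described above; the toy layer and sizes are placeholders:

```python
import torch

# Use Apple's Metal GPU backend ("mps") when available, else the CPU.
# The Neural Engine is not a selectable device here, as noted above.
device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")

model = torch.nn.Linear(512, 512).to(device)
opt = torch.optim.SGD(model.parameters(), lr=1e-3)
x = torch.randn(64, 512, device=device)

# One small-scale training step, which works fine on MPS:
opt.zero_grad()
loss = model(x).pow(2).mean()
loss.backward()
opt.step()
print(device, loss.item())
```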
(Score: 4, Interesting) by RamiK on Sunday June 30 2024, @11:42PM
In some cases it's possible (and therefore, advantageous) to train on the NPUs: https://docs.nvidia.com/deeplearning/performance/mixed-precision-training/index.html#faq-general__section_yyq_ppq_cjb [nvidia.com]
The problem is that the tradeoff of 2-5x improved performance at the expense of accuracy rarely makes sense for one-time expenses like model training.
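As a reference point, here's a minimal sketch of the mixed-precision recipe the linked doc describes, using PyTorch's AMP utilities; the toy model is a placeholder and a CUDA device with tensor cores is assumed:

```python
import torch

device = torch.device("cuda")                   # assumes a tensor-core GPU
model = torch.nn.Linear(1024, 1024).to(device)  # placeholder model
opt = torch.optim.SGD(model.parameters(), lr=1e-3)
scaler = torch.cuda.amp.GradScaler()

x = torch.randn(32, 1024, device=device)

opt.zero_grad()
with torch.cuda.amp.autocast():   # matmuls run in half precision on tensor cores
    loss = model(x).pow(2).mean()
scaler.scale(loss).backward()     # scale the loss so fp16 gradients don't underflow
scaler.step(opt)                  # unscales grads; skips the step on inf/NaN
scaler.update()
```

The weights themselves stay in fp32; only the matmul-heavy compute drops to half precision, which is where both the speedup and the accuracy concern come from.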
It depends on the model parameters, but you should be able to run PyTorch models on Apple's NPU by converting them with coremltools: https://apple.github.io/coremltools/docs-guides/source/convert-pytorch.html [github.io]
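A minimal sketch of that conversion flow, per the coremltools guide; the toy model, shapes, and filename are placeholders:

```python
import torch
import coremltools as ct

# Placeholder model; a real one would come from your training code.
model = torch.nn.Sequential(torch.nn.Linear(128, 64), torch.nn.ReLU()).eval()
example = torch.randn(1, 128)

traced = torch.jit.trace(model, example)   # TorchScript trace, as the guide requires
mlmodel = ct.convert(
    traced,
    inputs=[ct.TensorType(shape=example.shape)],
    compute_units=ct.ComputeUnit.ALL,      # let Core ML schedule eligible ops onto the ANE
    convert_to="mlprogram",
)
mlmodel.save("toy.mlpackage")
```

Whether the ANE actually runs it depends on the ops in the model; Core ML silently falls back to the GPU/CPU for anything unsupported.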
What you saw probably compared the energy usage of shader cores vs. the NPU in general, since the 4090 also has 512 tensor cores for half/mixed-precision floating point and outperforms anything integrated into a SoC that anyone has to offer:
M3: https://browser.geekbench.com/ml/v0/inference/402129 [geekbench.com]
4090: https://browser.geekbench.com/ml/v0/inference/394717 [geekbench.com]
compiling...
(Score: 3, Insightful) by Ingar on Sunday June 30 2024, @09:55AM
It was never there to begin with.
Love is a three-edged sword: heart, mind, and reality.
(Score: 2) by RedGreen on Sunday June 30 2024, @11:21AM
"However, this recent downturn on the stock market might show that investors are sending a message that they're not so bullish on the AI industry's monumental claims of how they will change the world with their technology."
It just might show that the current pump and dump the gamblers on Wall Street engage in at all times is over. The fleecing of the rubes has run its course, only to be rinsed and repeated for as long as the greedy suckers keep believing the bullshit and giving them their money. I have seen it happen so many times in my life; the cycle is endless.
Those people are not attacking Tesla dealerships. They are tourists showing love. I learned that on Jan. 6, 2021.
(Score: 0, Troll) by Runaway1956 on Sunday June 30 2024, @11:33AM (1 child)
Does this mean that we won't see AI in every headline? God, I hate buzzwords, and I'm sure most Soylentils hate them just as much. "Journalists" of whatever caliber seem to rely on them though.
“I have become friends with many school shooters” - Tampon Tim Walz
(Score: 3, Touché) by Tork on Sunday June 30 2024, @07:57PM
How'd you get this far following the tech industry and only now start gettin' sick of buzzwords and journalists that are wrapped in quotes? Didja block out "e-commerce"... like I did?
🏳️🌈 Proud Ally 🏳️🌈
(Score: 5, Insightful) by stormwyrm on Sunday June 30 2024, @12:08PM
Is the technology behind AI useful? Sure, it could be. But is it $3 trillion useful? Extremely unlikely. I've not seen such insane valuations since the dot-com era. Something useful is likely to come out of all this in the end just as with the Internet in the late 1990s, but there will probably be a bloodbath at least as brutal when the bubble finally pops before we get there.
Numquam ponenda est pluralitas sine necessitate.
(Score: 4, Informative) by gnuman on Sunday June 30 2024, @01:13PM
As a software dev, I'm now, what, 15 months into being replaced... and so far I've seen no improvement past the copy-pasta-from-Stack-Overflow "AI developer".