An interesting article on the economics of AI Chips by Mihir Kshirsagar
This week, OpenAI announced a multibillion-dollar deal with Broadcom to develop custom AI chips for data centers projected to consume 10 gigawatts of power. This investment is separate from another multibillion-dollar deal OpenAI struck with AMD last week. There is no question that we are in the midst of making one of the largest industrial infrastructure bets in United States history. Eight major companies (Microsoft, Amazon, Google, Meta, Oracle, OpenAI, and others) are expected to invest over $300 billion in AI infrastructure in 2025 alone. Spurred by news about the vendor-financed structure of the AMD investment and a conversation with my colleague Arvind Narayanan, I started to investigate the unit economics of the industry from a competition perspective.
What I have found so far is surprising. It appears that we're making important decisions about who gets to compete in AI based on financial assumptions that may be systematically overstating the long-run sustainability of the industry by a factor of two. That said, I am open to being wrong in my analysis and welcome corrections as I write these thoughts up in an academic article with my colleague Felix Chen.
Here is the puzzle: the chips at the heart of the infrastructure buildout have a useful lifespan of one to three years due to rapid technological obsolescence and physical wear, but companies depreciate them over five to six years. In other words, they spread out the cost of their massive capital investments over a longer period than the facts warrant—what The Economist has referred to as the "$4trn accounting puzzle at the heart of the AI cloud."
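The arithmetic behind that "factor of two" claim can be sketched with straight-line depreciation. The $300 billion figure comes from the 2025 spending estimate above; the lifespan midpoints below are my own illustrative assumptions within the one-to-three and five-to-six year ranges the article cites:

```python
# Back-of-the-envelope sketch of the depreciation gap.
# Dollar figure is the article's 2025 estimate; lifespans are
# illustrative midpoints of the ranges discussed, not reported data.
capex = 300e9  # 2025 AI infrastructure investment (USD)

book_life = 5.5       # years: companies' five-to-six year schedule
realistic_life = 2.5  # years: within the one-to-three year useful lifespan

annual_expense_book = capex / book_life       # what the books report
annual_expense_real = capex / realistic_life  # what the hardware implies

# Ratio is book_life / realistic_life = 5.5 / 2.5 = 2.2
understatement = annual_expense_real / annual_expense_book

print(f"Book annual depreciation:      ${annual_expense_book / 1e9:.0f}B")
print(f"Realistic annual depreciation: ${annual_expense_real / 1e9:.0f}B")
print(f"Annual cost understated by:    {understatement:.1f}x")
```

Under these assumptions, the annual depreciation charge roughly doubles, which is what "overstating the long-run sustainability by a factor of two" cashes out to: reported margins absorb about half the true annual hardware cost.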
Center for Information Technology Policy (Princeton University)
(Score: 3, Touché) by JoeMerchant on Tuesday October 21, @11:08PM (13 children)
The analog tech you refer to is akin to flint and kindling: it doesn't scale well, and it's very expensive to get a significant number of FLOPS out of it.
The digital tech you attempt to tout as superior to analog due to its "digitalness" is billions of times more capable of rendering double-precision float calculations per dollar than the analog tech. Cheap, like dropping stale gasoline from a cargo plane.
The point is that digital isn't superior because of its low-end tendency to view the world in binary; it's superior because it has developed grains of beach sand into highly organized micro-computing machines, capable of greater dynamic range, higher bandwidth, and a lower noise floor than the latest iteration of analogue systems.
🌻🌻🌻🌻 [google.com]
(Score: 2, Insightful) by khallow on Wednesday October 22, @02:11AM (12 children)
The use of "digital" here was as an accurate label to describe the technology (especially as compared to the label "analogue").
The binary nature of the operations is a big part of the reason it is cheaper and faster, in addition to those other advantages.
(Score: 2) by JoeMerchant on Wednesday October 22, @02:37PM (11 children)
>Analogue perceptrons haven't been competitive with digital competitors for 70 years.
I took that, and your later statements, to be a bash of analogue methods as compared to digital, even binary based, ones.
It fits your persona, just not the real world I live in.
(Score: 1) by khallow on Thursday October 23, @12:06PM (10 children)
Taking an accurate, truthful, and fully relevant statement as indicating a defective state of mind?
"It fits your persona." In other words, you just acknowledged that you weren't in your "real world I live in" at the time you "took that".
This goes back to your erroneous beliefs about perception: "Perception is all there is." [soylentnews.org] The problem here is that you could have perceived better before you even started your first post. But that would require you to think first rather than just run off of an erroneous non-perception-based mental model that you had going in your head at the time.
(Score: 2) by JoeMerchant on Thursday October 23, @12:48PM (9 children)
> indicating a defective state of mind?
No, the defective state of mind observation is more of a holistic thing, built over years.
> Perception is all there is.
That will never change. No matter how good of a handle you think you have on the truth, all you know comes from your limited personal experience - be that post-birth programming of your neural network, or pre-birth encoding in your DNA, all of that was "learned" on this tiny little mud ball which is an unfathomably small speck within an unremarkable and insignificant galaxy in the universe that we, as a species, just started to observe barely a century ago.
Being stuck in the middle of nowhere in North America is similarly limiting, even with Fox News to inform you about the outside world.
(Score: 1) by khallow on Friday October 24, @04:48AM (8 children)
I don't use Fox News as a primary source. Back in 2001, I happened to watch Fox News in action on TV. They had red, white, and blue colors up. A very patriotic gimmick. I decided that silliness wasn't for me.
As you've noted, you live in rural Florida. What makes my isolation more significant than your equivalent isolation? Hypocrisy is a common shoal that ad hominems founder on.
But you could have figured that out from the sources I actually use in my many discussions on the internet. That you failed to is yet another example of your "holistic" failings over the years.
Just in this thread, we have quite the collection of fallacies: confirmation/observation bias, fallacy of the stolen concept, and straw man fallacies.
(Score: 2) by JoeMerchant on Friday October 24, @06:14PM (7 children)
At the risk of breaking the fourth wall, I hope it's obvious that I enjoy our little sparring sessions, and I don't think you drink (solely) from the Fox firehose. That's not an insult I want to throw your way. The point was, rather, that an awful lot of people in places less remote than MF, ID do have such a biased perspective. Want an example? Just look at how Brexit was pulled off, or the Spanish-American War, or any US presidential election in the past 20 years.
I have lived all over the Florida panhandle, mostly in metro areas, though we did have a plan to move to a rural area and owned (but only occasionally camped on) 20 rural acres on a river for nearly 20 years, while under the delusion that building a home there for us and our children might be desirable. I spent 20 years in Miami and have been in another major metro area for the last 12. The Florida town I grew up in wasn't a "major metro area" when I lived there, but it may as well be one now: the population doubled while I was in college, has more than tripled again since, and it was far beyond "a small town" even when I was born there.
I'm not a significant world traveler, though I have been to various countries in Europe more times than I can immediately recall (fewer than a dozen, more than six), plus a few scattered trips to other places. In my opinion, it's not the number of times you have cleared customs but what you do while you are there, and how you do it, that really determines the value of the "experience". I certainly could learn more about the world firsthand, but I feel I have done enough travel to diverse countries, with reasonably deep exposure to their cultures, to have a reasonable "feel" for information that comes to me through the internet, video documentaries, and other sources: what can be trusted (none of it 100%), and what's clearly promotional material by one or more sources (most of it). Even with that cynical view, there is a tremendous amount of credible information to glean from these modern sources, much more than from word of mouth among fellow travelers and the dusty books of centuries past.
--- Fourth wall replaced ---
>confirmation/observation bias, fallacy of the stolen concept, and straw man fallacies.
Take your little "what's your fallacy" website and stick it, you know where. Just like the clowns on the news, you're twice as guilty of every fallacy you are accusing others of.
(Score: 2, Touché) by khallow on Sunday October 26, @03:09AM (6 children)
Sorry dude, but widespread use of fallacies is one of the strongest pieces of evidence that you aren't thinking. I already pointed to multiple instances just in this thread. Meanwhile, all you have is an empty assertion that I'm "twice as guilty". So let's review:
Really, read up on these things so that you don't do them. The only place I need to stick these lists of fallacies is in your head. But I need your help for that.
(Score: 2) by JoeMerchant on Sunday October 26, @03:30PM (5 children)
Just because I give you a Touché mod doesn't mean I agree with you.
(Score: 1) by khallow on Sunday October 26, @07:35PM (4 children)
Sure. Thanks for the mod just the same.
My point about the fallacies is that their use is evidence that either the author isn't thinking or they hope that their audience isn't thinking! There is no use of fallacies that is both rational and in good faith. They hide the truth.
(Score: 2) by JoeMerchant on Sunday October 26, @08:12PM (3 children)
My distaste for the "what's your fallacy" website of 10-15 years ago (IIRC) is that so many people inappropriately stretch the definitions to fit arguments they don't like, arguments presented in good faith with solid backing and logic. Because there's a fallacy on the list that looks a little like it might apply, off we go, and now we're arguing about whether the fallacy applies or not. You'll rarely get an argument back from me about one of those fallacies you claim applies to what I write.
Another thing: these aren't Hemingway novels I'm writing for you here; they're off the cuff, maybe lightly researched if I'm curious about the specifics of the topic myself.
(Score: 1) by khallow on Sunday October 26, @08:44PM (2 children)
There actually is argument by fallacy [wikipedia.org] out there. The typical defenses are either to point out why the claimed fallacy doesn't apply, or to acknowledge the fallacy and redo the argument without it. The point is that if you're doing it right, you can either directly rebut the accusation of fallacy and move on, or come up with a successful argument that doesn't have that fallacy, and then move on.
(Score: 2) by JoeMerchant on Sunday October 26, @09:23PM (1 child)
> directly rebut the accusation of fallacy and move on, or come up with a successful argument that doesn't have that fallacy, and then move on.
Not you specifically, but I have found that the majority of people who throw those fallacy barbs out there don't care whether you successfully prove them wrong; they'll just continue with their position, often doubling down.
(Score: 1) by khallow on Monday October 27, @11:05AM
Shrug. Welcome to conversation. I've been known to say something like "I've already explained why that's not a fallacy. How about we get back on topic?" and go from there.