An interesting article on the economics of AI Chips by Mihir Kshirsagar
This week, OpenAI announced a multibillion-dollar deal with Broadcom to develop custom AI chips for data centers projected to consume 10 gigawatts of power. This investment is separate from another multibillion-dollar deal OpenAI struck with AMD last week. There is no question that we are in the midst of making one of the largest industrial infrastructure bets in United States history. Eight major companies, including Microsoft, Amazon, Google, Meta, Oracle, and OpenAI, are expected to invest over $300 billion in AI infrastructure in 2025 alone. Spurred by news about the vendor-financed structure of the AMD investment and a conversation with my colleague Arvind Narayanan, I started to investigate the unit economics of the industry from a competition perspective.
What I have found so far is surprising. It appears that we're making important decisions about who gets to compete in AI based on financial assumptions that may be systematically overstating the long-run sustainability of the industry by a factor of two. That said, I am open to being wrong in my analysis and welcome corrections as I write these thoughts up in an academic article with my colleague Felix Chen.
Here is the puzzle: the chips at the heart of the infrastructure buildout have a useful lifespan of one to three years due to rapid technological obsolescence and physical wear, but companies depreciate them over five to six years. In other words, they spread out the cost of their massive capital investments over a longer period than the facts warrant—what The Economist has referred to as the "$4trn accounting puzzle at the heart of the AI cloud."
Center for Information Technology Policy (Princeton University)
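To make the scale of that gap concrete, here is a minimal back-of-the-envelope sketch in Python (my own illustration, not the article's model; the $300 billion figure is just the round 2025 industry-wide spend cited above) of straight-line depreciation under the two useful-life assumptions:

    # Illustrative only: compare the annual expense booked when chips are
    # depreciated over 5-6 years versus the 1-3 year useful life the article
    # says is realistic. Straight-line depreciation spreads the purchase
    # price evenly over the assumed life.

    def annual_depreciation(capex: float, useful_life_years: float) -> float:
        return capex / useful_life_years

    capex = 300e9  # round number based on the ~$300 billion 2025 spend cited above

    booked = annual_depreciation(capex, 6)     # the longer schedule companies use
    realistic = annual_depreciation(capex, 3)  # the shorter life the article cites

    print(f"Expense booked at a 6-year life: ${booked / 1e9:.0f}B per year")
    print(f"Expense at a 3-year life:        ${realistic / 1e9:.0f}B per year")
    print(f"Understatement factor:           {realistic / booked:.1f}x")

The shorter schedule roughly doubles the annual cost, which is where the factor-of-two concern about the industry's reported economics comes from.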
(Score: 0) by Anonymous Coward on Wednesday October 22, @01:07AM (3 children)
Same for that human distance running thing. It's better for war, not really better for hunting for food. Most "primitive" human tribes use their brains to catch food and don't run for hours.
(Score: 1) by khallow on Wednesday October 22, @02:19AM (2 children)
(Score: 0) by Anonymous Coward on Wednesday October 22, @04:16PM (1 child)
The hunter/gatherer lifestyle is terrible. You need farming and agriculture so that one person can produce enough food to support way more than a few other people.
A hunter-gatherer society wouldn't have been able to sustain a lot of nice stuff back in the past (if living in towns and cities back then had been so terrible and worse than being a hunter-gatherer, there wouldn't have been so many people living in towns and cities); and in the modern day it couldn't sustain things like the SN website, the Internet, vaccines, antibiotics, hospitals, solar panels, etc.
It also would be unlikely to lead towards humans developing space colonies and related tech, which would arguably be necessary to significantly delay the extinction of humans.
Trying to achieve modern tech capabilities with only food produced hunter-gatherer style would likely cause even greater devastation to flora and fauna. Imagine how many pigs and chickens you'd have to hunt in the forests, and how many berries and the like you'd have to forage and gather.
(Score: 1) by khallow on Thursday October 23, @01:40AM
Recall my thing about scale? If you don't need to support way more than a few people, or to support fancy infrastructure, then farming and agriculture just don't have that much of a draw. Similarly, a low-grade hunter/gatherer presence would have relatively low environmental impact (unless you are a large, tasty animal).
Keep in mind that the propaganda greatly predates the space age. Ancient cities of Egypt, China, and Mesopotamia touted their advantages long before.