The CEO of one of China’s top chipmakers just issued a stark warning about big tech’s unprecedented race to build out as many AI data centers as it can.
“Companies would love to build 10 years’ worth of data center capacity within one or two years,” Semiconductor Manufacturing International Corp.’s Co-CEO Zhao Haijun said Wednesday on a call with analysts, per Bloomberg. “As for what exactly these data centers will do, that hasn’t been fully thought through.”
Alphabet, Microsoft, Meta, and Amazon alone are expected to spend nearly $700 billion on AI this year, while repeatedly assuring investors that demand outpaces supply. Over the next three years, the total price tag of the data center buildout is expected to exceed $3 trillion.
Those financial commitments have started to spook investors. Meanwhile, the picture that is gradually forming has led to the current AI hype being compared to the dot-com bubble.
Flush with investment and anticipating explosive demand for internet services, the telecommunications industry spent billions of dollars laying fiber optic cable in the late 1990s. By 2002, the dot-com bubble had burst, yet reportedly less than 5% of that fiber optic network was in use. A telecom crash quickly followed, and the unused fiber, known as “dark fiber,” sat idle for years.
Eventually, though, demand for the internet did pan out as expected, just years late. As internet traffic grew exponentially, much of the infrastructure built in the late 1990s was finally put to use.
For AI, though, the situation might be different: the AI chips in these data centers have a relatively clear expiration date. If demand doesn’t pan out before the chips wear out, these companies will have flushed a whole lot of money and resources down the drain.
Meta says its chips are now good for roughly five and a half years, up from a previous estimate of four years. Nvidia executives have claimed that the company’s chips that were shipped out six years ago are still in full use.
But even if the chips themselves are fine, their value also depreciates as new, better models get churned out. At its current rate, Nvidia releases a new flagship AI chip every year.
The depreciation of the value of chips is factored into company earnings, but industry experts are having a hard time seeing eye to eye on whether the current estimates are realistic.
Some argue that a six-year depreciation cycle is perfectly reasonable, and that older GPUs will still be desirable as cheaper alternatives when newer, more advanced versions come out. On the other hand, investors like Michael Burry (of “The Big Short” fame) claim that the actual useful life of an AI chip is no more than two to three years.
“By my estimates, they will understate depreciation by $176 billion 2026-2028,” Burry said in a post on X.
In its latest annual report, Microsoft said its “computer equipment” had an estimated useful life ranging anywhere from two to six years.
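Why the assumed useful life matters so much comes down to simple straight-line depreciation: the same hardware bill produces a very different annual expense depending on how many years it is spread over. A minimal sketch, using a hypothetical $100 billion of chip spending (the dollar figure and the 2.5-year midpoint of Burry's range are illustrative assumptions, not reported numbers):

```python
# Illustrative straight-line depreciation: how the assumed useful life
# of AI hardware changes the annual expense. All figures hypothetical.

def annual_depreciation(cost: float, useful_life_years: float) -> float:
    """Spread the cost evenly over the assumed useful life."""
    return cost / useful_life_years

capex = 100e9  # hypothetical $100B spent on AI chips

six_year = annual_depreciation(capex, 6.0)    # optimistic assumption
short = annual_depreciation(capex, 2.5)       # midpoint of a 2-3 year life

print(f"6-year life:   ${six_year / 1e9:.1f}B per year")
print(f"2.5-year life: ${short / 1e9:.1f}B per year")
print(f"Gap per year:  ${(short - six_year) / 1e9:.1f}B")
```

Stretching the assumed life from 2.5 to 6 years cuts the reported annual expense by more than half on the same spending, which is the mechanism behind Burry's understatement claim.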