By Karen Weise
The spending that the industry’s giants expect artificial intelligence to require is starting to come into focus — and it is jarringly large.
If 2023 was the tech industry’s year of the A.I. chatbot, 2024 is turning out to be the year of A.I. plumbing. It may not sound as exciting, but tens of billions of dollars are quickly being spent on behind-the-scenes technology for the industry’s A.I. boom.
Companies from Amazon to Meta are revamping their data centers to support artificial intelligence. They are investing in huge new facilities, while even places like Saudi Arabia are racing to build supercomputers to handle A.I. Nearly everyone with a foot in tech or giant piles of money, it seems, is jumping into a spending frenzy that some believe could last for years.
Microsoft, Meta, and Google’s parent company, Alphabet, disclosed this week that they had spent more than $32 billion combined on data centers and other capital expenses in just the first three months of the year. The companies all said in calls with investors that they had no plans to slow down their A.I. spending.
In the clearest sign of how A.I. has become a story about building a massive technology infrastructure, Meta said on Wednesday that it needed to spend billions of dollars more on chips and data centers for A.I. than it had previously signaled.
“I think it makes sense to go for it, and we’re going to,” Mark Zuckerberg, Meta’s chief executive, said in a call with investors.
The eye-popping spending reflects an old parable in Silicon Valley: The people who made the biggest fortunes in California’s gold rush weren’t the miners — they were the people selling the shovels. No doubt Nvidia, whose chip sales have more than tripled over the last year, is the most obvious A.I. winner.
The money being thrown at technology to support artificial intelligence also recalls the spending patterns of the dot-com boom of the 1990s. For all of the excitement around web browsers and newfangled e-commerce websites, the companies making the real money were software giants like Microsoft and Oracle, the chipmaker Intel, and Cisco Systems, which made the gear that connected those new computer networks together.
But cloud computing has added a new wrinkle: Since most start-ups and even big companies from other industries contract with cloud computing providers to host their networks, the tech industry’s biggest companies are spending big now in hopes of luring customers.
Google’s capital expenditures — largely the money that goes into building and outfitting data centers — almost doubled in the first quarter, the company said. Microsoft’s were up 22 percent. Amazon, which will report earnings on Tuesday, is expected to add to that growth.
Meta’s investors were unhappy with Mr. Zuckerberg, sending his company’s share price down more than 16 percent after the call. But Mr. Zuckerberg, who just a few years ago was pilloried by shareholders for a planned spending spree on augmented and virtual reality, was unapologetic about the money that his company is throwing at A.I. He urged patience, potentially for years.
“Our optimism and ambitions have just grown quite a bit,” he said.
Investors had no problem stomaching Microsoft’s spending. Microsoft is the only major tech company to report financial details of its generative A.I. business, which it said accounted for more than a fifth of the growth of its cloud computing business. That amounted to $1 billion in three months, analysts estimated.
Microsoft said its generative A.I. business could have been even bigger if the company had enough data center capacity to meet demand, underscoring the need to keep building.
The A.I. investments are creating a halo for Microsoft’s core cloud computing offering, Azure, helping it draw new customers. “Azure has become a port of call for pretty much anybody who is doing any A.I. project,” Satya Nadella, Microsoft’s chief executive, said on Thursday.
(The New York Times sued Microsoft and its partner, OpenAI, in December, claiming copyright infringement of news content related to their A.I. systems.)
Google said sales from its cloud division were up 28 percent, including “an increasing contribution from A.I.”
In a letter to shareholders this month, Andy Jassy, Amazon’s chief executive, said that much attention had been paid to A.I. applications, like ChatGPT, but that the opportunity for more technical efforts, around infrastructure and data, was “gigantic.”
For the computing infrastructure, “the key is the chip inside it,” he said. Bringing down costs and wringing more performance out of that hardware, he added, is central to Amazon’s effort to develop its own A.I. chips.
Infrastructure demands generally fall into two buckets. First, there is training the largest cutting-edge models, which some A.I. developers say could soon cost more than $1 billion for each new round. Chief executives said that being able to work on developing cutting-edge systems, either directly or with partners, was essential for remaining at the forefront of A.I.
And then there is what’s called inferencing, or querying the models to actually use them. This can involve customers tapping into the systems, like an insurer using generative A.I. to summarize a customer complaint, or the companies themselves putting A.I. directly into their own products, as Meta recently did by embedding a chatbot assistant in Facebook and Instagram. That’s also expensive.
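For a concrete, if simplified, sense of what that inference work looks like, here is a minimal sketch of the insurer example, not drawn from the article: a short program that sends a complaint to a cloud-hosted model and gets a summary back. The endpoint URL, model name and request fields are hypothetical stand-ins for whatever hosted service a customer might actually use.

```python
# A minimal, purely illustrative sketch of an "inference" request: a customer
# sends text to a cloud-hosted model and gets a generated answer back. The
# endpoint URL, model name and JSON fields are hypothetical placeholders,
# not any real provider's API.
import json
import urllib.request

ENDPOINT = "https://example-cloud-provider.test/v1/generate"  # hypothetical


def summarize_complaint(complaint_text: str) -> str:
    payload = json.dumps({
        "model": "example-summarizer",  # hypothetical model name
        "prompt": "Summarize this insurance complaint:\n" + complaint_text,
        "max_tokens": 150,
    }).encode("utf-8")
    request = urllib.request.Request(
        ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    # Every request like this one runs on chips in a provider's data center,
    # which is why heavy inference traffic drives so much of the spending.
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())["text"]


if __name__ == "__main__":
    print(summarize_complaint("My claim for storm damage was denied twice ..."))
```

Each such call is cheap on its own, but billions of them a day add up to the data center demand the companies are now racing to meet.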
Data centers take time to build and outfit, and chips are in short supply and costly to fabricate. With such long-term bets, Susan Li, Meta’s finance chief, said the company was building with “fungibility”: It wants wiggle room to change how it uses the infrastructure if the future turns out not to be exactly what it expects.