Record sales of artificial intelligence chips sent Nvidia’s revenue soaring 262 per cent in the past quarter, beating lofty expectations, and its chief executive said its blockbuster growth was set to continue this year with the launch of a new line of chips.
Jensen Huang told investors the company would see “a lot” of revenue from its new Blackwell chips this year as it profits from exploding demand for the computing power behind generative AI.
Blackwell will contribute to a new phase of growth for the company, Huang said, adding that Nvidia would continue rolling out newer, more powerful chips at the same pace. “After Blackwell, there’s another chip, and we are on a one-year rhythm,” he said.
Demand for Nvidia’s AI data centre graphics processing units has skyrocketed in the past year as the biggest tech companies rush to develop the computing infrastructure needed to deliver powerful new AI products at scale. Google, Microsoft, Meta and Amazon have all indicated that their spending will remain high throughout 2024.
Revenue for the three months to the end of April was $26bn versus consensus estimates of $24.7bn. The enormous year-on-year increase was similar to the previous quarter, when growth hit 265 per cent. For the current quarter Nvidia expects revenue of about $28bn, plus or minus 2 per cent, compared with consensus estimates of $26.8bn.
Nvidia’s data centre revenue, which relates to its coveted AI chips, was up 427 per cent year on year for the quarter to $22.6bn, driven by strong demand for its current-generation Hopper GPUs, chief financial officer Colette Kress told investors. Shipments of the Blackwell chip are expected to start this quarter.
Nvidia shares, which have risen more than 90 per cent since the start of the year, extended their blistering rally, climbing about 6.7 per cent in pre-market trading on Thursday. The chipmaker also announced a 10-for-1 stock split effective from June 7 and said it was raising its quarterly cash dividend by 150 per cent.
Ahead of the results announcement, traders were bracing for big swings in Nvidia shares and markets more broadly. The stock’s massive rally has made it one of the most closely watched names on Wall Street. Since the start of 2023 its market capitalisation has increased more than six-fold to $2.3tn, overtaking Google parent Alphabet and Amazon to become the third-most valuable US-listed company.
Nvidia has been moving quickly to capitalise on the surge of demand for AI and stay ahead of competitors and customers that are developing their own AI chips. In March it revealed its Blackwell chips, which it says are twice as powerful as the current generation for training AI models and offer five times the performance on “inference”, the speed at which such models respond to queries. That came just a year after the company revealed its previous generation of GPU architecture, Hopper.
Analysts had questioned whether the transition to a new product line could dent the massive year-on-year growth Nvidia recorded in previous quarters if a temporary “air pocket” in demand emerged. The rapid pace of its chip releases has, for example, led Amazon to swap planned orders of chips based on the previous generation of Nvidia’s architecture for the Blackwell line.
But Huang reassured investors that demand for both the Hopper and Blackwell lines was “way ahead of supply”, a situation that would probably continue “well into next year”.
Diluted earnings per share were $5.98, up more than 600 per cent from a year ago. Gross margin was 78.4 per cent, slightly better than the 77 per cent analysts had predicted, and net income was $14.9bn, ahead of expectations of $13.2bn.
Rivals AMD and Intel have been rolling out AI data centre chips to compete with Nvidia’s, as well as joining forces with Nvidia’s clients to offer alternatives to Cuda, the Nvidia software platform that cements its dominance as a chip supplier.
In April, Intel and AMD announced lacklustre first-quarter results and modest guidance, suggesting they have yet to reap the benefits of the explosion in demand. On Tuesday, Microsoft said it would use AMD’s new MI300X accelerator chips and ROCm software to run some of the most demanding AI workloads on its Azure cloud service.
“Nvidia beat on the data centre [revenue] and beat across the board,” said Daniel Newman, chief executive of The Futurum Group. “The whole market was waiting for this number and Nvidia delivered.”
The stock split would create “more accessibility” as well as “additional momentum for the stock”, he added. “The AI trade is alive and well.”