

Can Nvidia stay at the heart of the new AI economy?

Adam, 44, had only ever bought shares in one other company when he decided to invest in a stock called Nvidia last month after a “hot tip” from a friend.

“It’s AI and clearly there’s money in that,” says Adam, who works in the hospitality industry in London and asked not to be named because his family is not aware of his stock trading.

While he struggles to remember the company’s name, how to pronounce it (it’s en-vid-iya) or even exactly what it does in artificial intelligence, “this is the future, it’s the stuff of Cyberdyne Systems,” Adam says, referencing the world-ending AI company of the Terminator movies. “People are a little bit dazzled.”

If Nvidia is only now capturing the popular imagination, it has long held the attention of Wall Street. The 31-year-old chipmaker this week shot past Apple and Microsoft to briefly become the world’s most valuable company, worth as much as $3.3tn.

Explosive demand for its graphics processing units, which are generally considered the best way to build large AI systems by the likes of Meta and Microsoft, has pushed its stock price up by around 700 per cent since the launch of OpenAI’s hit ChatGPT chatbot in November 2022.

The unprecedented rise of a company that until recently was obscure to most people outside the tech industry reflects the AI fervour that has gripped Silicon Valley and Wall Street in equal measure. But its return to third place after only a couple of days underscores the fierce competition in this new tech arena.

Nvidia’s surge is the story of the AI economy: its explosive growth, its appeal to investors and its unpredictable future. Where it goes next is set to mirror — and perhaps determine — that economy’s path.


The last time a company with a brand as relatively obscure as Nvidia’s occupied this position was in March 2000, when Cisco, which makes networking gear, overtook Microsoft at the apogee of the dotcom bubble.

Now, as then, companies are ploughing billions of dollars into building infrastructure for a promised revolution in not only computing but the global economy. Like Nvidia, Cisco struck gold selling the digital picks and shovels to internet prospectors. But its stock price has never returned to its 2000 peak, after the bubble burst later that year.

The fact that Big Tech’s capital spending surge on AI is based more on revenue projections than actual returns has stoked fears of history repeating itself.

262%: Nvidia’s revenue growth, year-on-year, in its most recent quarter

“I understand the worry,” says Bernstein analyst Stacy Rasgon, but there are crucial differences. “The worry with Cisco was that they were building out a bunch of capacity for demand that they were hoping for, and even today there’s fibre buried in the ground that they never used.”

Rasgon adds that, compared with Cisco’s price at the height of the dotcom bubble, Nvidia’s shares are trading at a far lower multiple of forecast earnings.

Companies such as Microsoft are already seeing some returns on their investment in AI chips, even if others such as Meta have warned it will take longer. If an AI bubble is forming, Rasgon adds, a pop does not appear imminent.

Cisco’s dotcom-era rise and fall stands in contrast with Apple and Microsoft. The two tech companies have competed for Wall Street’s highest perch for years, not only by making highly successful products, but also by building platforms that support massive business ecosystems. Apple has said there are around 2mn apps on the App Store, driving hundreds of billions of dollars in developer revenues each year.

Cisco’s dotcom-era rise and fall stands in contrast with Apple and Microsoft © San Francisco Chronicle/AP

The Nvidia economy looks very different to the one surrounding Apple. In many ways, the popularity of a single app — ChatGPT — is responsible for much of the investment that has driven Nvidia’s stock price upwards in the past few months. The chipmaker says it has 40,000 companies in its software ecosystem and 3,700 “GPU-accelerated applications”.

Instead of selling hundreds of millions of affordable electronic devices to the masses every year, Nvidia has become the world’s most valuable company by selling a relatively small number of expensive AI chips for data centres, primarily to just a handful of companies.

Large cloud computing providers such as Microsoft, Amazon and Google accounted for almost half of Nvidia’s data centre revenues, the company said last month. According to chip analyst group TechInsights, Nvidia sold 3.76mn of its graphics processing unit chips for data centres last year. That was still enough to give it a 72 per cent share of that specialist market, leaving rivals such as Intel and AMD far behind.

Still, those sales are growing fast. Nvidia’s revenues grew 262 per cent year over year to $26bn in its most recent quarter, ending in April, a faster pace even than Apple in the early years of the iPhone.

Demand for Nvidia’s products has been fuelled by tech companies that are seeking to overcome questions about AI’s capabilities by throwing chips at the problem.

In pursuit of the next leap forward in machine intelligence, companies such as OpenAI, Microsoft, Meta and Elon Musk’s new start-up xAI are racing to construct data centres connecting as many as 100,000 AI chips together into supercomputers — three times as large as today’s biggest clusters. Each of these server farms costs $4bn in hardware alone, according to chip consultancy SemiAnalysis.

Hunger for greater computing capacity for AI is not going away. Nvidia’s chief executive, Jensen Huang, predicts that more than $1tn will be spent retooling existing data centres and building what he calls “AI factories” in the coming years, as everyone from Big Tech companies to nation states builds their own AI models.


That scale of investment will only continue if Nvidia’s customers figure out how to make money from AI themselves. And at just the moment the company reached the top of the stock market, more people in Silicon Valley are starting to question whether AI can live up to the hype.

David Cahn, a partner at Sequoia, one of Silicon Valley’s biggest start-up investors, warned in a blog post this week of a “speculative frenzy” surrounding AI and the “delusion” that “we’re all going to get rich quick” from advanced AI and stockpiling Nvidia chips.

Even though he predicts huge economic value from AI, Cahn estimates Big Tech companies will collectively need to generate hundreds of billions of dollars more a year in new revenues to recoup their investment in AI infrastructure at its current accelerating pace. For the likes of Microsoft, Amazon Web Services and OpenAI, incremental sales from generative AI are generally projected to run in the single-digit billions this year.

Nvidia’s chief executive Jensen Huang predicts that more than $1tn will be spent retooling existing data centres and building what he calls ‘AI factories’ in the coming years © AFP via Getty Images

The period when tech executives could make grand promises about AI’s capabilities is “coming to an end”, says Euro Beinat, global head of AI and data science at Prosus Group, one of the world’s largest tech investors. “There’s going to be a lot more realism in the next 16 to 18 months about what we can and cannot do.”

Nvidia is never likely to be a mass-market consumer company like Apple. But analysts say if it is to continue to thrive it must emulate the iPhone maker and build out a software platform that will bind its corporate customers to its hardware.

“The argument that Nvidia is not just going to blow out and be a Cisco — once the hardware hype cycle fizzles out — has to be tied to the software platform,” says Ben Bajarin of Silicon Valley-based consultancy Creative Strategies.

Huang has long argued that Nvidia is more than just a chip company. Instead, it provides all the ingredients to build “an entire supercomputer”, he has said. That includes chips, networking equipment and its Cuda software, which lets AI applications “talk” to its chips and is seen by many as Nvidia’s secret weapon.

In March, Huang unveiled Nvidia Inference Microservices, or NIM: a set of ready-made software tools for businesses to more easily apply AI to specific industries or domains.

Huang said these tools could be understood as the “operating system” for running large language models like the ones that underpin ChatGPT. “We’ll produce NIMs at a very large scale is my guess,” he said, predicting that its software platform — dubbed Nvidia AI Enterprise — “is going to be a very large business”.

Nvidia’s Cambridge-1 supercomputer is designed to assist the UK’s life sciences industry with modelling and research © Nvidia

Nvidia has previously given away its software for free but now plans to charge businesses to deploy Nvidia AI Enterprise, at a cost of $4,500 per GPU per year. The effort is crucial to bringing in more corporate or government customers that lack the in-house AI expertise of a Big Tech company.

The problem for Nvidia is that many of its biggest customers also want to “own” that relationship with developers and build their own AI platform. Microsoft wants developers building on its Azure cloud platform. OpenAI has launched GPT Store, modelled on the App Store, offering customised versions of ChatGPT. Amazon and Google have their own developer tools, as do AI start-ups Anthropic, Mistral and many more.

That is not the only way in which Nvidia is coming into competition with its biggest customers. Google has developed a custom AI accelerator chip, the Tensor Processing Unit, and Amazon and Microsoft have followed with their own. Though these are small in scale, the TPU especially shows that it is possible for customers to loosen dependence on Nvidia.

In turn, Nvidia is cultivating potential future rivals to its Big Tech customers, in a bid to diversify its ecosystem. It has funnelled its chips to the likes of Lambda Labs and CoreWeave, cloud computing start-ups that are focused on AI services and rent out access to Nvidia GPUs, as well as directing its chips to local players such as France-based Scaleway, over the multinational giants.

Those moves form part of a broader acceleration of Nvidia’s investment activities across the booming AI tech ecosystem. In the past two months alone it has participated in funding rounds for Scale AI, a data labelling company that raised $1bn, and Mistral, a Paris-based OpenAI rival that raised €600mn.

PitchBook data shows Nvidia has struck 116 such deals over the past five years. As well as potential financial returns, taking stakes in start-ups gives Nvidia an early look at what the next generation of AI might look like, helping to inform its own product road map.

“[Huang] is neck-deep in details about AI trends, and what they could mean,” says Kanjun Qiu, chief executive of AI research lab Imbue, which Nvidia backed last year. “He’s built a huge team to work with AI labs directly so he can understand what they’re trying to build, even though they aren’t his customers.”

It is this kind of long-term thinking that put Nvidia at the centre of the current AI boom. But Nvidia’s journey to become the world’s most valuable company came with several near-death experiences along the way, Huang has said, and in the cut-throat market of Silicon Valley, no company is assured survival.
