
This article first appeared in The Edge Malaysia Weekly on September 11, 2023 - September 17, 2023

FOR the second time in three months, US graphics chip giant Nvidia Corp reported blowout quarterly earnings on Aug 23. The company is benefiting from insatiable demand for artificial intelligence (AI) chips.

The demand is coming not just from firms trying to take advantage of the boom in generative AI, such as Microsoft Corp and Google, or from giant cloud infrastructure players such as Amazon.com Inc. It is also coming from start-ups and from sovereign wealth funds in Saudi Arabia and Abu Dhabi, which are hoarding AI chips they can resell to smaller firms that cannot get access to the chipsets they need.

Quarterly revenues of US$13.5 billion (RM63.1 billion) were up 101% year on year and 88% over the previous quarter, while gross margins of 71.2% beat consensus estimates of just 60%.

What is driving the demand? Microsoft, which needs to support the ChatGPT infrastructure and has been ramping up generative AI co-pilots, grew its total capital expenditure by US$2.3 billion, or 35%, sequentially in the last quarter. Microsoft is now by far Nvidia’s largest customer, accounting for 22% of the chip giant’s total revenues. Other global hyperscalers, that is, firms such as Amazon Web Services, Google Cloud, Oracle Corp and Meta Platforms Inc that offer large-scale cloud services and can rapidly scale infrastructure to meet the growing demands of their users and of their own operations, together bought as many Nvidia graphics processing units (GPUs) last quarter as Microsoft did.

For its part, Nvidia believes that, over the long term, the total addressable market for AI will exceed US$600 billion: US$300 billion in chips and systems, US$150 billion in generative AI software, and US$150 billion in Omniverse enterprise software to enable the development, deployment and management of advanced 3D applications. Manuvir Das, Nvidia’s vice-president of enterprise computing, told a Goldman Sachs tech conference last week that “accelerated computing” is not just about the chips; it is about the whole stack. “If you think about the traditional computing systems, what has changed over the decades is simply the location: you’re in the cloud, you’re doing it on your phone, but it’s essentially the same style of computing,” all done through a central processing unit (CPU).

Increasingly, companies are using computing to do everything in the cloud. “More computing means you need more data centres, you need more energy, you need more horsepower, and it’s just not sustainable.” What Nvidia is trying to do, he argues, is do it better and more sustainably. “We’re saying that with accelerated computing, for the same footprint, we can do 10 times, 100 times the work, and that’s going to be the only way,” Das says, adding that, with its AI chipsets, Nvidia is basically helping companies “go digital and grow more efficiently in previously unimaginable ways”.

Founded in 1993 by three chip design engineers, Jensen Huang, Chris Malachowsky and Curtis Priem, Nvidia just happened to be in the right place at the right time. In 2000, Nvidia rightly bet that gaming chips were going to be the next big growth driver for the semiconductor industry; its chips were in Microsoft’s earliest Xbox game consoles. Eight years ago, Nvidia rightly bet that crypto mining was a nice niche for its GPUs. For the past five years, Nvidia has been investing to improve its GPUs so that they can be used for accelerated computing in AI applications. Nvidia is an opportunistic chip company that bet three times, and each time its bet has paid off in a big way, notes Prof Aswath Damodaran of New York University’s Stern School of Business. “The first time around, you could say they were lucky; but, by the time you see them doing it for the third time, you know it is by design,” he argues.

“Nvidia had two near-death experiences since it listed in 1999, with its stock price plunging 80% in 2000 and in 2001,” Damodaran notes. “But it bounced back, made new bigger bets and eventually succeeded.”

In 2018, Nvidia stock was down 40% and, between November 2021 and October 2022, its shares again plunged nearly 70%, only to more than quadruple since then. Investors abandoned Nvidia, but management knew its AI chips had incredible potential. It was only a matter of time before the world woke up to realise what Nvidia had in its hands.

Yet, the chip industry is notorious for its feast-to-famine cycles. Chip makers discover a great niche, sell a ton of chips and make tens of billions of dollars only to see competitors flock in and make that segment of semiconductors a commodity of sorts. The next wave of chip firms is then forced to move on to a new niche.

Twenty years ago, Japanese chip makers were feared in Silicon Valley. Today, Japan is a minnow in semiconductors, behind the US, China, Taiwan and South Korea. From the early 1980s until a few years ago, Intel was by far the world’s largest chipmaker. Today, Nvidia is eight times Intel’s size. Communications chip maker Broadcom Inc, memory chip maker Samsung Electronics Co Ltd, chip foundry Taiwan Semiconductor Manufacturing Co (TSMC) and Nvidia’s closest rival Advanced Micro Devices Inc (AMD) are all far bigger than Intel, which has been struggling for relevance.

Providing the right tools

Nvidia is thriving right now because it is providing the tools for the AI boom. In 1999, at the height of the dotcom boom, the company that reached the highest valuation was not an internet firm such as Yahoo! or an e-commerce player like Amazon.com; it was Cisco Systems Inc, which provided routers and networking gear for internet service providers. It was a classic pick-and-shovel play. When I talk to venture capitalists and techpreneurs in the US, they often mention the gold rush that began in 1848, when more than 300,000 people were lured to California in search of gold. Most of the money was made not by the mining companies that dug out the gold but by those that sold the picks, shovels and pans used to dig the shiny metal out of the ground. Another big beneficiary was Levi Strauss, which sold jeans to the gold diggers.

Will the current AI spending spree end like the dotcom bubble burst in 2000, when fibre-optic firms such as Global Crossing spent tens of billions building infrastructure to provide broadband internet? “We don’t think so,” says Mark Lipacis, semiconductor analyst for Jefferies & Co in San Francisco.

“The growth rates of Nvidia today and the fibre-optics rollout in the late 1990s are similar, but what is different is the maturity of the business models. In the late 1990s, competitive carriers put dark fibre in the ground and equipment in central offices before business models were proven — ‘build it and they will come’. Today, there are companies that are generating a return using generative AI through product enhancements, productivity gains or outright cost reductions.”

After the dotcom bubble burst, emerging internet firm Google bought up that fibre, and the main beneficiary of the overbuilding was YouTube, which needed a lot of bandwidth to push out millions of videos.

Moreover, unlike the dark fibre laid under the sea and underground in the late 1990s, much of which went unused for 10 years, the newer, more powerful and more sophisticated chips that Nvidia and other AI chip makers intend to roll out each year can be put to work immediately by companies such as Microsoft, Google and Amazon in their new generative AI applications. You cannot leave an AI chip idle for 10 years and hope that, some day in the future, like fibre, it too will be put to good use. So, as more sophisticated chips are rolled out, both start-ups and tech giants will be forced to come up with new uses for those chips.

Chip users such as Tesla Inc, Google, Amazon and Microsoft, as well as chip makers AMD and Intel, are readying their own AI chips, though it might be at least 18 months before a viable alternative to Nvidia hits the market. By then, Nvidia will have rolled out two more iterations of its own AI chips.

As the world’s biggest user of AI chips, Microsoft has also emerged as a huge investor in AI chip start-ups that are challenging Nvidia. Just last week, the software powerhouse joined Singapore’s Temasek Holdings in a US$110 million funding round for d-Matrix, a Silicon Valley-based AI chip start-up. A year ago, d-Matrix, which builds AI chips for data centres, raised US$44 million. Other companies are raising money at stratospheric valuations. Start-up cloud GPU provider CoreWeave Inc, a large customer of Nvidia’s AI chips, is reportedly looking at a stake sale later this month that would value it at up to US$8 billion, up from the US$2.2 billion it was valued at in April. Privately held CoreWeave, which offers services to support high-performance computing, has gained significant traction amid the recent generative AI boom.

For now, there is visibility of demand for AI chips for the rest of this year and much of next year. The key question for Nvidia, says Pierre Ferragu, tech hardware analyst at New Street Research, “is how much more GPU spending hyperscalers and other buyers can afford beyond 2024”.

Will the AI bubble burst soon?

What of Nvidia’s runaway stock? It is up 230% this year, while the Philadelphia Semiconductor Index (SOX) is up 46.2% year to date and the broader Standard & Poor’s 500 index is up 17% since January. Are we in an AI bubble that is about to burst?

Stacy Rasgon, semiconductor analyst at Sanford C Bernstein & Co, sees Nvidia’s revenues surging to US$53.48 billion in the current financial year ending January 2024, from US$26.97 billion in the last financial year, and accelerating to US$69.05 billion next year, with annual free cash flow growing from US$3.8 billion last year to US$34.97 billion next year. Rasgon has a 12-month price target of US$675 on the hot chip stock, and he is actually a bit stingy: at least one analyst, Rosenblatt Securities’ Hans Mosesmann, has a 12-month price target of US$1,100, or a whopping 134% upside.

Timothy Arcuri, chip analyst at UBS, forecasts that Nvidia’s revenues will surge to US$97.4 billion in the next financial year and net earnings will top US$53.5 billion. (To put this in perspective, just three years ago, Nvidia had a mere US$16.6 billion in annual revenues and US$6.2 billion in net profits.) “Everyone has been looking for ways to play AI that aren’t as expensive as Nvidia, given the run this year,” Rasgon notes. Yet, the Bernstein analyst argues, buying Nvidia “itself remains the best way to accomplish that, given the magnitude of earnings revisions”. Nvidia’s stock, he believes, “will still come out cheaper than it was” before its Aug 23 earnings announcement. That is because analysts have been revising Nvidia’s earnings estimates upwards even faster than the stock has appreciated in recent months, so the shares now trade on a lower multiple of expected earnings than they did before the results.

Nvidia’s stock closed at US$470.61 on Sept 6, or just 31.6 times the current financial year’s consensus forecast earnings. In contrast, retailer Costco Wholesale Corp’s stock trades at 35.6 times estimated earnings on net margins of 2.55%. Even Nvidia’s closest chip peer, AMD, trades at 32.5 times forecast earnings. Did I mention that AMD has gross margins of around 45.5%, compared with Nvidia’s 71.5%?

By the way, Microsoft, the world’s largest software firm, reported gross margins of 68.9% in the last quarter. When a semiconductor hardware firm has better margins than the world’s top software firm, investors sit up and take notice.

Make no mistake, the AI revolution is already here. After smartphones were launched in 2007, it took a few years before services such as ride hailing, food delivery and short-term home rental like Airbnb emerged. After the advent of 5G in 2018, it took years before self-driving robotaxi services began in California, and robo-surgeries are only now becoming fairly common. It will be a while before ubiquitous AI-linked services become apparent.

 

Assif Shameen is a technology writer based in North America 

 
