Shares of Nvidia closed up 2.3% at an all-time high of $504.20 on Monday. The record comes ahead of the company’s third-quarter financial results, due Tuesday, when analysts expect revenue growth of more than 170%.
As if that wasn’t astonishing enough, the company’s forecast for the fiscal fourth quarter, according to LSEG estimates, is expected to show an even higher number: growth of almost 200%.
As the Thanksgiving holiday approaches, Wall Street will be closely scrutinizing the company that has been at the heart of this year’s artificial intelligence boom.
Nvidia’s stock price has soared 237% in 2023, far outpacing that of any other member of the S&P 500. Its market cap now stands at $1.2 trillion, well above that of Meta or Tesla. Any indication in the results that enthusiasm for generative AI is waning, that some large customers are turning to AMD processors, or that Chinese restrictions are weighing on the business could spell trouble for a stock that has risen so far, so fast.
“Expectations are high ahead of NVDA’s third quarter 2024 earnings conference call on November 21,” Bank of America analysts wrote in a report last week. They have a buy rating on the stock and said they “expect a beat/raise.”
However, they flagged Chinese restrictions and competition concerns as two issues that will attract investors’ attention. In particular, AMD’s emergence in the generative AI market presents new dynamics for Nvidia, which has largely had the market for AI GPUs to itself.
AMD CEO Lisa Su said late last month that the company expects GPU revenue of around $400 million in the fourth quarter, and more than $2 billion in 2024. The company said in June that the MI300X, its most advanced GPU for AI, would begin shipping to some customers this year.
Nvidia is still by far the market leader in AI GPUs, but high prices are a problem.
“NVDA must forcefully counter the narrative that its products are too expensive for generative AI inference,” Bank of America analysts wrote.
Last week, Nvidia unveiled the H200, a GPU designed to train and deploy the types of AI models that are fueling the generative AI boom, allowing businesses to develop smarter chatbots and turn simple text into creative graphics.
The new GPU is an upgrade to the H100, the chip OpenAI used to train its most advanced large language model, GPT-4 Turbo. H100 chips cost between $25,000 and $40,000, according to a Raymond James estimate, and thousands of them working together are needed to create the largest models in a process called “training.”
The H100 chips are part of Nvidia’s data center group, whose second-quarter revenue jumped 171% to $10.32 billion. That’s about three-quarters of Nvidia’s total revenue.
For the fiscal third quarter, analysts expect data center revenue to more than triple to $13.02 billion, up from $3.83 billion a year earlier, according to FactSet. Total revenue is expected to rise 172% to $16.2 billion, according to analysts surveyed by LSEG, formerly Refinitiv.
Based on current estimates, growth will peak in the fiscal fourth quarter at around 195%, according to LSEG estimates. Expansion will remain robust through 2024, but is expected to slow in each quarter of the year.
Executives can expect to field questions on the earnings call related to the massive shakeup at OpenAI, the creator of chatbot ChatGPT, which has been a major catalyst for Nvidia’s growth this year. On Friday, OpenAI’s board of directors announced the sudden dismissal of CEO Sam Altman due to disputes over the speed of the company’s product development and the focus of its efforts.
OpenAI is a big buyer of Nvidia GPUs, as is Microsoft, the main backer of OpenAI. After a chaotic weekend, OpenAI said Sunday evening that former Twitch CEO Emmett Shear would lead the company on an interim basis, and shortly after, Microsoft CEO Satya Nadella said Altman and ousted OpenAI president Greg Brockman would join Microsoft to lead a new advanced AI research team.
Nvidia investors have so far shrugged off China-related concerns, despite the potential importance to the company’s business. The H100 and A100 AI chips were the first to be hit last year by new U.S. restrictions aimed at curbing sales to China. Nvidia said in September 2022 that the U.S. government would still allow it to develop the H100 in China, which accounts for 20% to 25% of its data center business.
The company has reportedly found a way to continue selling its products in the world’s second-largest economy while complying with U.S. rules. Nvidia is set to deliver three new chips, based on the H100, to Chinese manufacturers, Chinese financial media outlet Cailian Press reported last week, citing sources.
Nvidia has historically avoided providing annual forecasts, preferring to focus only on the following quarter. But given how much money investors have poured into the company this year and how little else there is to watch this week, they will be listening closely to CEO Jensen Huang’s tone on the conference call for any signs that the buzz around generative AI may be fading.