Shares of Nvidia closed up 2.3% on Monday at an all-time high of $504. The record comes ahead of the company's fiscal third-quarter results, due Tuesday, when analysts expect revenue growth of more than 170%.
If that's not impressive enough, the company's forecast for the fiscal fourth quarter is likely to show an even bigger number: nearly 200% growth, according to LSEG estimates.
Just in time for Thanksgiving, Wall Street will be taking a closer look at the company at the center of this year's AI boom.
Nvidia's stock price is up 245% in 2023, far more than any other member of the S&P 500. Its market cap is now $1.2 trillion, well above Meta's or Tesla's. Any indication that generative AI enthusiasm is waning, that some big customers are shifting to AMD processors, or that China's restrictions are having a detrimental effect on the business could spell trouble for a stock that has run up so far.
“Expectations are high, driving NVDA's FQ3'24 earnings call on Nov. 21,” Bank of America analysts wrote in a report last week. They have a buy rating on the stock and said they “expect a beat/raise.”
However, they cited China's restrictions and competitive concerns as two issues that will attract investors' attention. In particular, AMD's entrance into the generative AI market presents a new dynamic for Nvidia, which has largely owned the market for AI graphics processing units (GPUs).
AMD CEO Lisa Su said late last month that the company expects GPU revenue of about $400 million in the fourth quarter and more than $2 billion in 2024. The company announced in June that the MI300X, its most advanced GPU for AI, would begin shipping to some customers this year.
Nvidia is still far and away the market leader in GPUs for AI, but high prices are an issue.
“NVDA will need to forcefully counter the narrative that its products are too expensive to power AI,” Bank of America analysts wrote.
Last week, Nvidia unveiled the H200, a GPU designed for training and deploying the kinds of AI models behind the generative AI explosion, enabling companies to build smarter chatbots and turn simple text into creative graphic designs.
The new GPU is an upgrade to the H100, the chip OpenAI uses to power its most advanced large language model, GPT-4 Turbo. H100 chips cost between $25,000 and $40,000, Raymond James estimates, and thousands of them working together are needed to create the largest models in a process called “training.”
The H100 chips are part of Nvidia's data center group, whose fiscal second-quarter revenue grew 171% to $10.32 billion. That accounted for about three-quarters of Nvidia's total revenue.
For the fiscal third quarter, analysts expect data center revenue to more than triple to $13.02 billion from $3.83 billion a year earlier, according to FactSet. Analysts polled by LSEG, formerly Refinitiv, expect total revenue to rise 172% to $16.2 billion.
Growth is currently estimated to peak in the fiscal fourth quarter at around 195%, according to LSEG. Expansion is expected to remain strong through 2024, but to slow in each quarter of the year.
Executives can expect questions on the earnings call about the massive shakeup at OpenAI, the maker of the chatbot ChatGPT, which has been a key catalyst for Nvidia's growth this year. On Friday, OpenAI's board announced the sudden dismissal of CEO Sam Altman amid a dispute over the speed of the company's product development and where it is focusing its efforts.
OpenAI is a big buyer of Nvidia's GPUs, as is Microsoft, OpenAI's main backer. After a chaotic weekend, OpenAI announced Sunday night that former Twitch CEO Emmett Shear would lead the company on an interim basis, and soon after, Microsoft CEO Satya Nadella said Altman and ousted OpenAI chairman Greg Brockman would join Microsoft to lead a new advanced AI research team.
Nvidia investors have so far dismissed concerns about China despite its importance to the company's business. The H100 and A100 AI chips were the first to be hit by new US restrictions last year aimed at curbing sales to China. Nvidia said in September 2022 that the US government would still allow it to develop the H100 in China, a market that accounts for 20% to 25% of its data center business.
The company has reportedly found a way to continue selling in the world's second-largest economy while complying with US rules. Nvidia plans to supply three new chips based on the H100 to Chinese manufacturers, Chinese financial media outlet Cailian Press reported last week, citing sources.
Nvidia has historically shied away from annual guidance, preferring to forecast only the next quarter. But given how much money investors have poured into the company this year, they'll be listening closely to CEO Jensen Huang's tone on the conference call for any sign that the generative AI buzz could be wearing off.
WATCH: EMJ's Eric Jackson expects a good report from Nvidia