Nvidia stock closes at record high, day before earnings

Shares of Nvidia climbed 2.3% on the day to close at a record high. The record comes ahead of the company’s third-quarter fiscal results on Tuesday, when analysts expect revenue growth of more than 170%.

If that’s not impressive enough, the company’s forecast for the fourth fiscal quarter, according to LSEG estimates, is likely to show an even bigger number: nearly 200% growth.

Heading into the Thanksgiving holiday, Wall Street is closely watching the company that has been at the center of this year’s artificial intelligence boom.

Nvidia’s stock price is up 237% in 2023, far more than any other member of the S&P 500. Its market cap is now $1.2 trillion, well above Meta’s or Tesla’s. Any indication in the earnings report that enthusiasm for generative AI is cooling, that some big customers are shifting to AMD’s processors, or that restrictions in China are having a negative impact on the business could spell trouble for a stock that has been on such a tear.

Bank of America analysts wrote in a report last week that expectations are high heading into Nvidia’s fiscal third-quarter earnings call on Nov. 21. They have a buy rating on the stock and said they expect the company to beat estimates and guide higher.

However, they pointed to China restrictions and competitive concerns as two issues that will capture investors’ attention. In particular, AMD’s emergence in the generative AI market is providing fresh competition for Nvidia, which has largely owned the market for AI graphics processing units (GPUs).

AMD CEO Lisa Su said late last month that the company expects GPU revenue of about $400 million in the fourth quarter and more than $2 billion in 2024. The company said in June that the MI300X, its most advanced GPU for AI, would start shipping to some customers this year.

Nvidia is still the market leader in GPUs for AI, but high prices are an issue.

The Bank of America analysts wrote that Nvidia must forcefully counter the assertion that its products are too expensive for generative AI inference.

Last week, Nvidia unveiled the H200, a GPU designed to train and deploy the kinds of AI models that are powering the generative AI explosion, allowing companies to develop smarter chatbots and turn simple text into creative graphics.

The new GPU is an upgrade from the H100, the chip OpenAI used to train its most advanced large language model, GPT-4 Turbo. H100 chips cost between $25,000 and $40,000, according to an estimate from Raymond James, and thousands of them working together are needed to create the biggest models in a process called “training.”

The H100 chips are part of Nvidia’s data center group, which saw revenue in the fiscal second quarter increase 171% to $10.32 billion. That accounted for about three quarters of Nvidia’s total revenue.

For the third fiscal quarter, analysts expect data center revenue to more than triple to $13.02 billion from $3.83 billion a year earlier, according to FactSet. Total revenue is expected to rise 172% to $16.2 billion, according to analysts surveyed by LSEG, formerly known as Refinitiv.

Growth is expected to peak in the fourth fiscal quarter at around 195%, LSEG estimates show. Expansion should remain strong through 2024 but is expected to slow in each quarter of the year.

Executives can expect to field questions on the earnings call about the upheaval at OpenAI, creator of the chatbot ChatGPT, which has been a key engine of Nvidia’s growth this year. On Friday, OpenAI’s board announced that CEO Sam Altman had been abruptly ousted, reportedly over a dispute about the pace of the company’s product development and where it is focusing its efforts.

OpenAI is a big buyer of Nvidia’s GPUs, as is Microsoft, OpenAI’s main backer. After a chaotic weekend, OpenAI said Sunday night that former Twitch CEO Emmett Shear would lead the company on an interim basis, and soon after, Microsoft CEO Satya Nadella said Altman and outgoing OpenAI chairman Greg Brockman would join Microsoft to lead a new advanced AI research team.

Nvidia investors have so far dismissed China-related concerns despite their importance to the company’s business. The H100 and A100 AI chips were the first to be hit by new US restrictions last year aimed at curbing sales to China. Nvidia said in September 2022 that the US government would still allow it to develop the H100 in China, a market that accounts for 20% to 25% of its data center business.

Nvidia has reportedly found a way to continue selling into the world’s second-largest economy while complying with US regulations. The company is expected to deliver three new chips, based on the H100, to Chinese manufacturers, Chinese financial media outlet Cailian Press reported last week, citing sources.

Nvidia has historically avoided giving annual guidance, preferring to forecast only the next quarter. But with how much money investors have poured into the company this year, and how little else there is for them to follow this week, they’ll be listening closely to CEO Jensen Huang’s tone on the conference call for any sign that the generative AI buzz may be wearing off.
