Nvidia (NVDA -2.09%) is the leading supplier of graphics processing units (GPUs) for data centers, which are used to develop artificial intelligence (AI) models. Demand for those chips is far outstripping supply, which is driving an incredible surge in the company's financial results.
Nvidia CEO Jensen Huang believes data center operators will invest $1 trillion over the next four years to upgrade their infrastructure in order to meet demand from AI developers. That estimate might be conservative based on forecasts from other sources, which I'll highlight later in this piece.
I think there is a clear mathematical path for Nvidia stock to soar by another 82%, which would take it comfortably above $200 in 2025. Here's why.
Blackwell shipments are likely to ramp up very quickly
Nvidia's H100 GPU was so popular in 2023 that the company had a whopping 98% share in the market for AI data center chips. It's still a top seller today, but data center operators are lining up to buy Nvidia's latest chips, which are built on its new Blackwell architecture. They offer a significant leap in performance, meaning they can process high volumes of data more quickly in both AI training and AI inference workloads.
The Blackwell-based GB200 NVL72 system can perform AI inference at 30 times the speed of the equivalent H100 system. A single GB200 GPU within the NVL72 system sells for around $83,000, and while that's roughly double the price of an H100, the thirtyfold increase in inference performance translates into significant cost savings for any company deploying AI.
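To put that in rough numbers, here is a minimal sketch of the performance-per-dollar math using only the figures above. It assumes the H100 price is about half the roughly $83,000 GB200 price and treats the 30x claim as raw inference throughput; real-world pricing and throughput will vary by configuration and workload.

```python
# Rough performance-per-dollar comparison using the figures cited above.
# Assumption: H100 price taken as ~half the ~$83,000 GB200 price, and the
# 30x figure treated as relative inference throughput.

gb200_price = 83_000             # approximate GB200 price within the NVL72 system
h100_price = gb200_price / 2     # ~ $41,500, per the "roughly double" comparison

h100_throughput = 1.0            # normalize H100 inference throughput to 1x
gb200_throughput = 30.0          # ~30x the H100, per the NVL72 comparison

h100_perf_per_dollar = h100_throughput / h100_price
gb200_perf_per_dollar = gb200_throughput / gb200_price

print(gb200_perf_per_dollar / h100_perf_per_dollar)  # ~15x more inference per dollar
```

In other words, even at double the price, the GB200 delivers roughly 15 times more inference output per dollar spent, which is why cost savings follow for companies deploying AI at scale.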
Microsoft is reportedly the largest buyer of Blackwell GPUs so far. It will use them to develop AI for its own purposes, but it will also rent the computing capacity to other AI developers for a fee through its Azure cloud platform. Amazon Web Services, Alphabet's Google Cloud, and Oracle are likely to be large Blackwell customers for the same reasons.
Nvidia shipped 13,000 Blackwell samples to customers during its recent fiscal 2025 third quarter (ended Oct. 27), but Jensen Huang says demand is "staggering," so that number will probably grow rapidly. Morgan Stanley predicts Nvidia will ship up to 300,000 units in the last three months of calendar 2024, followed by 800,000 units in the first three months of 2025.
Nvidia says Oracle alone is building clusters using more than 131,000 Blackwell GPUs, and since Oracle is spending far less on AI capital expenditures (capex) than cloud providers like Microsoft and Amazon, Morgan Stanley's estimates probably aren't far off the mark.
Nvidia's growth is forecast to slow, but it's still lightning fast
Nvidia's fiscal year is different from the regular calendar year. As I mentioned above, the company is currently in fiscal 2025, which will end less than two months from now, in late January 2025. As a result, Nvidia's fiscal 2026 will occupy most of calendar 2025.
Nvidia's revenue is on track to come in at a record $128.7 billion for fiscal 2025 (according to the company's guidance), which would be a whopping 111% increase from fiscal 2024. At least 80% of that revenue will come from its data center segment, which is where sales of AI GPUs like the H100 and GB200 are accounted for.
Wall Street's consensus estimate for fiscal 2026 (provided by Yahoo) suggests Nvidia's total revenue will climb to another record high of $195.4 billion. That represents growth of 51%, which would be less than half the growth rates Nvidia delivered in fiscal 2024 and fiscal 2025.
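As a quick back-of-the-envelope check on those figures, here is the growth math spelled out using the guidance and consensus numbers quoted above (a simple sketch; the implied fiscal 2024 figure is derived from the 111% growth rate rather than quoted directly):

```python
# Back-of-the-envelope growth math using the figures quoted above (in billions).
fy2025_revenue = 128.7                     # company guidance for fiscal 2025
fy2024_revenue = fy2025_revenue / 2.11     # implied by the ~111% increase (~$61B)
fy2026_revenue = 195.4                     # Wall Street consensus for fiscal 2026

fy2026_growth = (fy2026_revenue / fy2025_revenue - 1) * 100
print(round(fy2026_growth, 1))             # ~51.8% -- roughly half the prior year's 111%
```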
Nvidia's revenue numbers are becoming so large that maintaining triple-digit percentage increases simply isn't feasible. That isn't necessarily a bad thing, especially because AI infrastructure spending is forecast to increase significantly from here.
As I highlighted earlier, Jensen Huang predicts there will be $1 trillion in AI infrastructure spending over the next four years. Well, Morgan Stanley believes just four companies -- Microsoft, Amazon, Alphabet, and Meta Platforms -- will spend a combined $300 billion in 2025 alone. That doesn't include other big spenders like Oracle, OpenAI, or even Tesla.
If that pace of spending merely holds, those four companies alone would top $1 trillion over four years, so Huang's estimate might be too conservative.
Nvidia has a mathematical path to over $200 per share in 2025
Nvidia's dominant market share in the data center GPU space affords it a significant amount of pricing power. In other words, demand is so strong that it can charge extremely high prices, which is boosting the company's profit margins.
That's why Nvidia's earnings per share (EPS) soared by 103% in the recent third quarter. Based on the company's trailing 12-month EPS of $2.62, its stock trades at a price-to-earnings (P/E) ratio of 56.1 as of this writing.
That sounds expensive at face value because the Nasdaq-100 technology index trades at a P/E ratio of just 33.9. However, Nvidia's average P/E ratio over the last 10 years is 58.6, so you could argue the stock is actually cheap right now.
Looking ahead, Wall Street's consensus estimate suggests Nvidia's EPS could come in at $4.43 in fiscal 2026. That places the stock at a forward P/E ratio of just 32.1, meaning the stock would have to soar 82% over the next year just to trade in line with its 10-year average P/E of 58.6, which implies a stock price of about $259.
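Here is that valuation math spelled out, as a simple sketch built only from the consensus EPS estimate and the P/E figures cited above (actual prices will move as earnings estimates are revised):

```python
# Valuation arithmetic behind the 82% figure, using the numbers cited above.
fy2026_eps = 4.43                 # Wall Street consensus EPS estimate for fiscal 2026
forward_pe = 32.1                 # forward P/E implied by the current share price
avg_pe_10yr = 58.6                # Nvidia's average P/E over the last decade

current_price = fy2026_eps * forward_pe   # ~ $142 per share
target_price = fy2026_eps * avg_pe_10yr   # ~ $259.6 per share
upside = (target_price / current_price - 1) * 100

print(round(current_price), round(target_price), round(upside, 1))  # ~142, ~260, ~82.6%
```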
I can't believe I'm saying this because Blackwell shipments are only just ramping up, but one Wall Street analyst thinks Nvidia's next GPU architecture (called Rubin) could be six months ahead of schedule. That might be yet another upside catalyst for the stock if Nvidia reveals more information over the next 12 months.