According to Morgan Stanley, a group of four tech giants (Microsoft, Amazon, Alphabet, and Meta Platforms) could spend a combined $300 billion building data center infrastructure for artificial intelligence (AI) development during 2025.
Nvidia (NVDA) owns a dominant share of the market for AI data center chips, so it could be one of the biggest beneficiaries of that spending. But it won't be the only winner -- Advanced Micro Devices (AMD) has an impressive lineup of AI hardware products rolling out this year, and Micron Technology (MU) is one of Nvidia's key component suppliers.
Here's why investors could do well by adding all three stocks to their portfolios.
The case for Nvidia
Developing AI models requires a substantial amount of computing power, and Nvidia's graphics processing units (GPUs) are the most popular chips for the job. Tech giants are racing to fill their data centers with as many of them as possible, not only to deploy their own AI software but also to rent the computing capacity to other AI developers for a profit.
Nvidia's H100 was the best AI GPU in the industry during 2023, and for most of 2024 until the newer H200 started shipping. But the company launched a new lineup of GPUs last year based on its more advanced Blackwell architecture, which offers a substantial leap in performance. The Blackwell GB200 NVL72 system, for example, can perform AI inference 30 times faster than the equivalent H100 system, paving the way for the most advanced AI models to date.
Nvidia started sending GB200 samples to customers at the end of 2024, but shipments are expected to scale rapidly this year. In fact, Blackwell revenue could surpass Hopper revenue (the previous architecture, which powers the H100 and H200) as soon as April.
Nvidia's fiscal year 2025 will wrap up at the end of this month, and the company is on track to deliver a record $128.6 billion in total revenue, representing 112% growth compared to fiscal 2024. If recent quarters are any indication, around 88% of that revenue will be attributable to the company's data center segment, which includes its AI GPU sales.
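Those two figures imply a rough dollar estimate for the data center segment. A minimal sketch of the arithmetic (the 88% share is an approximation based on recent quarters, so the result is directional, not an official company figure):

```python
total_revenue_b = 128.6    # projected fiscal 2025 total revenue, $ billions
data_center_share = 0.88   # approximate data center share of total revenue

# Implied data center revenue: roughly $113 billion
data_center_revenue_b = total_revenue_b * data_center_share
print(round(data_center_revenue_b, 1))  # 113.2
```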
Investors will be focused on Blackwell sales during this calendar year, but reports have emerged that development of Nvidia's next-generation architecture, "Rubin," is six months ahead of schedule. That means the company could preview an entirely new set of GPUs before the end of the year, which will give investors some insight into potential revenue growth for calendar 2026 and beyond.
Despite the 700% gain in Nvidia stock over the past two years, it still looks cheap, so it probably isn't too late for investors to add it to their portfolios.
The case for Advanced Micro Devices (AMD)
AMD supplies processors for some of the most popular consumer electronics in the world, from Sony's PlayStation 5 to the infotainment systems inside Tesla's electric vehicles. However, the company has also become a competitor to Nvidia in the data center.
Last year, AMD started shipping its MI300X AI GPU, which was designed to compete with the H100. It has attracted many of Nvidia's top customers, like Oracle and Microsoft, some of which report lower costs and better performance from using it instead of the H100. AMD built on that success by recently launching the newer MI325X, but investors are looking ahead to the release of the MI350 series.
The MI350 is built on AMD's new Compute DNA (CDNA) 4 architecture, which is designed to compete directly with Blackwell. It's expected to deliver an eye-popping performance increase of 35 times compared to the MI300X. The new GPU is expected to ship in the second half of 2025, so it's quite a ways behind Nvidia's GB200, but its incredible output should make it an attractive piece of hardware nonetheless.
AMD will report its fiscal year 2024 financial results later this month. CEO Lisa Su went into the year expecting to see $2 billion worth of AI GPU sales, but her most recent forecast suggests that figure could now exceed $5 billion. During the third quarter (ended Sept. 28) alone, AMD's data center revenue soared by 122% compared to the year-ago period, so GPU sales are ramping up fast.
But it gets better, because AMD is also a leading supplier of AI chips for personal computers (PCs). This could be a major growth opportunity as AI workloads begin shifting from data centers to devices, paving the way for faster user experiences. AMD's new Ryzen AI 300 Series chips deliver industry-leading performance, and the company expects more than 100 computing platforms to use them by the end of 2025, from top PC manufacturers including Microsoft, HP, and Lenovo.
AMD has a long way to go before it's an AI juggernaut like Nvidia, but its stock looks like a great value right now. Based on Wall Street's estimate for the company's fiscal 2025 earnings, its stock trades at a forward price-to-earnings (P/E) ratio of just 16.7, which makes it far cheaper than Nvidia.
The case for Micron
Micron is a top supplier of memory and storage chips, which aren't quite as glamorous as the GPUs from Nvidia and AMD, but they are becoming just as important in AI workloads. Memory chips, for example, store information in a ready state where it can be retrieved by the GPU for processing at any moment. This is especially critical when performing AI inference, because it can drive faster responses for users of chatbot applications.
Micron's HBM3E (high-bandwidth memory) solution for data centers is the best in the industry, delivering 50% more capacity while consuming 30% less energy than competing hardware. Micron's HBM3E is so good that it's already completely sold out until 2026 -- partly because Nvidia is using it in the Blackwell GB200.
The market for data center HBM was worth around $16 billion in 2024, but Micron predicts it will grow to $100 billion by 2030. The company is already working on an HBM4E solution to stay ahead of the competition, which is expected to deliver a 50% increase in performance over HBM3E.
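That forecast implies a steep compound annual growth rate for the HBM market. A quick sketch of the implied CAGR over the six years from 2024 to 2030 (this rate is back-calculated from the figures above, not something Micron has stated directly):

```python
start_b, end_b = 16.0, 100.0   # HBM market size, $ billions (2024 -> 2030)
years = 6

# Compound annual growth rate implied by the forecast:
# (end / start)^(1/years) - 1, roughly 36% per year
cagr = (end_b / start_b) ** (1 / years) - 1
print(f"{cagr:.1%}")  # 35.7%
```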
Micron generated $4.4 billion in data center revenue during its fiscal 2025 first quarter (ended Nov. 28, 2024), which was a monumental 400% increase from the year-ago period. It was also the first time the company's data center business represented more than half of its total revenue (which was $8.7 billion for the quarter).
Wall Street's consensus estimate (provided by Yahoo) suggests Micron could deliver $8.90 in earnings per share during the current fiscal year 2025, which places its stock at a forward P/E ratio of just 11.1. In other words, it's significantly cheaper than AMD and Nvidia.
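Forward P/E is simply share price divided by expected earnings per share, so the stated ratio and EPS estimate pin down an implied share price. A minimal sketch of the math (the roughly $99 price is back-calculated from the article's figures, not a quoted market price):

```python
def forward_pe(share_price: float, forward_eps: float) -> float:
    """Forward price-to-earnings ratio: price divided by expected EPS."""
    return share_price / forward_eps

forward_eps = 8.90             # Wall Street consensus EPS for fiscal 2025
stated_ratio = 11.1            # forward P/E cited in the article

# Implied share price: about $98.80
implied_price = forward_eps * stated_ratio
print(round(implied_price, 2))                    # 98.79
print(round(forward_pe(implied_price, forward_eps), 1))  # 11.1
```

The same formula applied to AMD's 16.7 forward P/E shows why the article calls Micron the cheaper stock: a lower ratio means investors pay less per dollar of expected earnings.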
If Nvidia sells a significant number of GB200 GPUs this year, then Micron will sell substantial volumes of HBM3E. Therefore, it doesn't make much sense for Micron stock to trade at such a steep discount to Nvidia stock, so investors who buy it today might be scooping up a bargain.