2024 could be dubbed the year of artificial intelligence (AI), as the latest advances involving generative AI made their way into mainstream usage and AI-associated stocks helped power the market to new heights. Some of the biggest AI winners have been in the semiconductor sector, where AI chips provide the computing power needed to train large language models and run AI inference.

Meanwhile, spending on AI infrastructure looks like it will only ramp up in 2025, with Microsoft recently announcing that it would spend a massive $80 billion building out AI data centers this year. Other megacap tech companies, such as Alphabet, Amazon, and Meta Platforms, have also indicated they will increase their AI infrastructure spending.

Let's look at three AI chipmakers that could benefit from this spending push.

1. Nvidia

Nvidia (NVDA) has become the king of AI infrastructure, and that has helped it grow into one of the biggest companies in the world as measured by market cap. Nvidia makes graphics processing units (GPUs), which, as the name implies, were originally designed to handle high-end graphics rendering, particularly in video games. However, the company later created a free software platform called CUDA that lets developers program its chips for other purposes. While Nvidia built CUDA to help sell more of its chips, the platform eventually became the standard on which developers are trained to program GPUs.

Since then, Nvidia has added a number of developer tools and libraries built specifically for AI, which have made the software indispensable and helped create a wide moat for the company. As a result, it now holds about a 90% share of the GPU market.

As large tech companies race to build new AI models, they need more and more computing power, and more often than not that power comes from Nvidia's GPUs. Microsoft is reportedly Nvidia's largest customer, so its plan to greatly increase spending on AI data centers should be a big growth driver for Nvidia in 2025.

Trading at a forward price-to-earnings ratio (P/E) of just over 31 times based on next year's analyst estimates and a price/earnings-to-growth ratio (PEG) of 0.98, the stock remains attractively valued. A PEG under 1 is generally considered undervalued, and growth stocks will often have PEGs well above 1.
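As a quick back-of-the-envelope sketch of how that PEG figure works, the snippet below simply divides the forward P/E by an expected earnings growth rate. The growth rate used here (~31.6%) is only the rate implied by the article's own numbers (a forward P/E of about 31 and a PEG of 0.98), not an independent estimate.

```python
# Illustrative PEG calculation: PEG = forward P/E / expected annual earnings growth (%).
# Assumed inputs: forward P/E of ~31 (from the article) and the growth rate that
# number implies for a 0.98 PEG -- roughly 31.6% -- not a separate analyst forecast.

forward_pe = 31.0           # forward price-to-earnings ratio
expected_growth_pct = 31.6  # implied annual earnings growth, in percent

peg = forward_pe / expected_growth_pct
print(f"PEG ratio: {peg:.2f}")  # ~0.98; a PEG under 1 is often read as undervalued
```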


2. Advanced Micro Devices

Advanced Micro Devices (AMD) is the distant No. 2 player in the GPU space, with about a 10% market share. Nonetheless, the company still benefits from the huge AI data center buildout. Last quarter, its data center segment revenue surged 122% year over year to $3.5 billion, led by sales of its Instinct GPUs and EPYC central processing units (CPUs).

The company has continually raised its data center GPU revenue forecast for 2024. It originally expected $2 billion in data center GPU revenue, but last quarter it guided for more than $5 billion. It noted that Microsoft, Meta Platforms, and Oracle are all using its MI300X GPUs. While its GPUs are used for training, the company has found more of a niche in AI inference, where customers deploy its GPUs for narrow, well-defined use cases, according to SemiAnalysis.

While AMD is riding the GPU wave as a distant second option, one area where it has shown real strength is server CPUs. Last quarter, it said it continued to gain share in that market as cloud computing companies expand their use of its EPYC CPUs across their data center infrastructure.

With its pending acquisition of ZT Systems, which designs and builds server equipment for data centers, AMD will also look to become an end-to-end data center solutions provider.

The stock trades at a forward P/E of only 17 times, making it a solid investment option to consider.

3. Broadcom

While Nvidia and AMD make mass-market GPUs, Broadcom (AVGO) helps customers design custom AI chips. These customized chips, called application-specific integrated circuits (ASICs), are designed for very specific tasks, so they tend to offer better performance and consume less power for those tasks than GPUs. On the downside, they lack the flexibility of GPUs, take time to design, and each design is applicable only to a single customer.

However, Broadcom's custom chips have been gaining traction. Alphabet was its first big customer, with Broadcom helping it develop its tensor processing units (TPUs). Alphabet has credited using TPUs alongside GPUs as a differentiator that lowers costs and reduces inference processing times. Since then, Broadcom has added several other large customers, believed to be Meta Platforms, ByteDance (the owner of TikTok), OpenAI, and Apple.

The company got investors excited last quarter when it said its three largest "hyperscaler" customers (companies that operate massive data centers) could each deploy up to 1 million AI chips in 2027, representing a $60 billion to $90 billion opportunity when networking equipment is included. Broadcom makes specialty networking chips that help these AI chips communicate with one another so they work more efficiently as a group. The opportunity could be even larger depending on how quickly its two new custom chip customers progress with their development.

The stock currently trades at 36 times analyst estimates for fiscal 2025 earnings (its fiscal year ends in October), which is more expensive than Nvidia or AMD. However, some of its other chip businesses are at cyclical troughs, and it has a pretty large opportunity in front of it with custom AI chips.