3. NVIDIA Corporation (NASDAQ:NVDA)
Number of Hedge Fund Investors: 179
NVIDIA Corporation (NASDAQ:NVDA)’s GPUs are essential for gaming, data centers, and AI, making the company well-positioned to benefit from the rising demand for high-performance computing. NVIDIA Corporation (NASDAQ:NVDA)’s strong presence in AI is a major growth factor, as its GPUs play a crucial role in training AI models. With AI rapidly expanding across various industries, NVIDIA Corporation (NASDAQ:NVDA) stands to gain from the increased need for its products.
Additionally, NVIDIA Corporation (NASDAQ:NVDA)’s AI platforms, like NVIDIA AI Enterprise and Omniverse, further establish its leadership in this field. NVIDIA Corporation (NASDAQ:NVDA)’s move into the automotive sector with its NVIDIA DRIVE platform for autonomous vehicles offers another promising growth opportunity. By collaborating with top automotive manufacturers and investing in autonomous driving technology, NVIDIA Corporation (NASDAQ:NVDA) is set to thrive in this emerging market.
Financially, NVIDIA Corporation (NASDAQ:NVDA) has demonstrated impressive revenue and profit growth, driven by its strengths in GPUs and AI and its expansion into automotive and data centers. In FQ2 2025, revenue reached $30 billion, up 15% from the previous quarter and 122% from the previous year, surpassing the expected $28 billion. Data center revenue hit a record $26.3 billion, up 16% from the previous quarter and 154% from the previous year, driven by high demand for NVIDIA Hopper, GPU computing, and networking platforms.
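As a rough illustration of what those growth rates imply, the short Python sketch below works backwards from the reported $30 billion quarter and the rounded percentages above to the implied prior-period figures; the dollar amounts it prints are back-of-the-envelope derivations from the article's percentages, not separately reported numbers.

```python
# Back-of-the-envelope check: derive implied prior-period revenue from the
# growth rates reported for FQ2 2025 (figures in billions of USD).

fq2_2025_revenue = 30.0   # reported FQ2 2025 revenue
qoq_growth = 0.15         # ~15% sequential growth
yoy_growth = 1.22         # ~122% year-on-year growth

# Implied prior-quarter and year-ago revenue (approximate, from rounded percentages)
prior_quarter = fq2_2025_revenue / (1 + qoq_growth)
year_ago_quarter = fq2_2025_revenue / (1 + yoy_growth)

print(f"Implied FQ1 2025 revenue: ~${prior_quarter:.1f}B")       # ~$26.1B
print(f"Implied FQ2 2024 revenue: ~${year_ago_quarter:.1f}B")    # ~$13.5B

# Same exercise for the data center segment: $26.3B, up ~16% QoQ and ~154% YoY
dc_revenue = 26.3
print(f"Implied prior-quarter data center revenue: ~${dc_revenue / 1.16:.1f}B")  # ~$22.7B
print(f"Implied year-ago data center revenue: ~${dc_revenue / 2.54:.1f}B")       # ~$10.4B
```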
Here’s what Colette Kress, CFO of NVIDIA Corporation (NASDAQ:NVDA), had to say on the company’s latest FQ2 2025 earnings call:
“Q2 was another record quarter. Revenue of $30 billion was up 15% sequentially and up 122% year-on-year and well above our outlook of $28 billion. Starting with data center, data center revenue of $26.3 billion was a record, up 16% sequentially and up 154% year-on-year, driven by strong demand for NVIDIA Hopper, GPU computing, and our networking platforms. Compute revenue grew more than 2.5 times and networking revenue grew more than 2 times from last year. Cloud service providers represented roughly 45% of our data center revenue, and more than 50% stemmed from consumer Internet and enterprise companies. Customers continue to accelerate their Hopper architecture purchases while gearing up to adopt Blackwell.
Key workloads driving our data center growth include generative AI model training and inferencing; video, image, and text data pre- and post-processing with CUDA and AI workloads; synthetic data generation; AI-powered recommender systems; and SQL and vector database processing. Next-generation models will require 10 to 20 times more compute to train with significantly more data. The trend is expected to continue. Over the trailing four quarters, we estimate that inference drove more than 40% of our data center revenue. CSPs, consumer Internet companies, and enterprises benefit from the incredible throughput and efficiency of NVIDIA’s inference platform. Demand for NVIDIA is coming from frontier model makers, consumer Internet services, and tens of thousands of companies and startups building generative AI applications for consumers, advertising, education, enterprise and healthcare, and robotics.
Developers desire NVIDIA’s rich ecosystem and availability in every cloud. CSPs appreciate the broad adoption of NVIDIA and are growing their NVIDIA capacity given the high demand. The NVIDIA H200 platform began ramping in Q2, shipping to large CSPs, consumer Internet, and enterprise companies. The NVIDIA H200 builds upon the strength of our Hopper architecture, offering over 40% more memory bandwidth compared to the H100. Our data center revenue in China grew sequentially in Q2 and is a significant contributor to our data center revenue. As a percentage of total data center revenue, it remains below levels seen prior to the imposition of export controls. We continue to expect the China market to be very competitive going forward. The latest round of MLPerf inference benchmarks highlighted NVIDIA’s inference leadership, with both the NVIDIA Hopper and Blackwell platforms combining to win gold medals on all tasks.”