Advanced Micro Devices, Inc. (NASDAQ:AMD) Q1 2024 Earnings Call Transcript April 30, 2024
Advanced Micro Devices, Inc. beat earnings expectations, reporting EPS of $0.62 against expectations of $0.60.
Operator: Greetings and welcome to the AMD First Quarter 2024 Conference Call. At this time, all participants are in a listen-only mode. A brief question-and-answer session will follow the formal presentation. [Operator Instructions] And as a reminder, this conference is being recorded. It is now my pleasure to introduce your host, Mitch Haws, Vice President, Investor Relations. Thank you, Mitch. You may begin.
Mitch Haws: Thank you and welcome to AMD’s first quarter 2024 financial results conference call. By now you should have had the opportunity to review a copy of our earnings press release and the accompanying slides. If you’ve not had the chance to review these materials, they can be found on the Investor Relations page of amd.com. We will refer primarily to non-GAAP financial measures during today’s call and the full non-GAAP to GAAP reconciliations are available in today’s press release and the slides posted on our website. Participants on today’s call are Dr. Lisa Su, our Chair and Chief Executive Officer; and Jean Hu, our Executive Vice President, Chief Financial Officer and Treasurer. This is a live call and will be replayed via webcast on our website.
Before we begin, I would like to note that Mark Papermaster, Executive Vice President and Chief Technology Officer, will attend the TD Cowen Technology, Media and Telecom Conference on May 29; and Jean Hu, Executive Vice President, Chief Financial Officer and Treasurer, will attend the JP Morgan Global Media and Communications Conference on Tuesday, May 21; the Bank of America Global Technology Conference on Wednesday, June 5; and the Jefferies Nasdaq Investor Conference on Tuesday, June 11. Today’s discussion contains forward-looking statements based on current beliefs, assumptions and expectations, which speak only as of today and, as such, involve risks and uncertainties that could cause actual results to differ materially from our current expectations.
Please refer to the cautionary statement in our press release for more information on the factors that could cause actual results to differ materially. With that, I’ll hand the call over to Lisa. Lisa?
Lisa Su: Thank you, Mitch, and good afternoon to all those listening today. This is an incredibly exciting time for the industry as the widespread deployment of AI is driving demand for significantly more compute across a broad range of markets. Against this backdrop, we are executing very well as we ramp our data center business and enable AI capabilities across our product portfolio. Looking at the first quarter, revenue increased to $5.5 billion. We expanded gross margin by more than 2 percentage points and increased profitability as data center and client segment sales each grew by more than 80% year-over-year. Data center segment revenue grew 80% year-over-year and 2% sequentially to a record $2.3 billion. The substantial year-over-year growth was driven by the strong ramp of AMD Instinct MI300X GPU shipments and a double-digit percentage increase in server CPU sales.
We believe we gained server CPU revenue share in the seasonally down first quarter, led by growth in enterprise adoption and expanded cloud deployments. In cloud, while the overall demand environment remained mixed, hyperscalers continued adopting fourth-gen EPYC processors to power more of their internal workloads and public instances. There are now nearly 900 AMD-powered public instances available globally, as Amazon, Microsoft, and Google all increase their fourth-gen EPYC processor offerings with new instances and regional deployments. In the enterprise, we have seen signs of improving demand as CIOs need to add more general purpose and AI compute capacity while maintaining the physical footprint and power needs of their current infrastructure.
This scenario aligns perfectly with the value proposition of our EPYC processors. Given our high core count and energy efficiency, we can deliver the same amount of compute with 45% fewer servers compared to the competition, cutting initial CapEx by up to half and lowering annual OpEx by more than 40%. As a result, enterprise adoption of EPYC CPUs is accelerating, highlighted by deployments with large enterprises, including American Airlines, DBS, Emirates Bank, Shell, and STMicro. We’re also building momentum with AMD-powered solutions for the most popular ERP and database applications. As one example, the latest generation of Oracle Exadata, the leading database solution used by 76 of the Fortune 100, is now powered exclusively by fourth-gen EPYC processors.
Looking ahead, we’re very excited about our next-gen Turin family of EPYC processors featuring our Zen 5 core. We’re widely sampling Turin, and the silicon is looking great. In the cloud, the significant performance and efficiency increases of Turin position us well to capture an even larger share of both first- and third-party workloads. In addition, there are 30% more Turin platforms in development from our server partners compared to fourth-gen EPYC platforms, expanding our enterprise opportunity with new solutions optimized for additional workloads. Turin remains on track to launch later this year. Turning to our broader data center portfolio, we delivered our second straight quarter of record data center GPU revenue as MI300 became the fastest ramping product in AMD history, passing $1 billion in total sales in less than two quarters.
In cloud, MI300X production deployments expanded at Microsoft, Meta, and Oracle to power generative AI training and inferencing for both internal workloads and a broad set of public offerings. For the enterprise, we’re working very closely with Dell, HPE, Lenovo, Supermicro, and others as multiple MI300X platforms enter volume production this quarter. In addition, we have more than 100 enterprise and AI customers actively developing or deploying MI300X. On the AI software front, we made excellent progress adding upstream support for AMD hardware in the OpenAI Triton compiler, making it even easier to develop highly performant AI software for AMD platforms. We also released a major update to our ROCm software stack that expands support for open source libraries including vLLM and frameworks including JAX, adds new features like video decode, and significantly increases generative AI performance by integrating advanced attention algorithms and support for sparsity and FP8.
Our partners are seeing very strong performance in their AI workloads. As we jointly optimize for their models, MI300X GPUs are delivering leadership inferencing performance and substantial TCO advantages compared to H100. For instance, several of our partners are seeing significant increases in tokens per second when running their flagship LLMs on MI300X compared to H100. We’re also continuing to enable the broad ecosystem required to power the next generation of AI systems, including as a founding member of the Ultra Ethernet Consortium, working to optimize the widely adopted Ethernet protocol to run AI workloads at data center scale. MI300 demand continues to strengthen. And based on our expanding customer engagements, we now expect data center GPU revenue to exceed $4 billion in 2024, up from the $3.5 billion we guided in January.
Long term, we are working increasingly closely with our cloud and enterprise customers as we expand and accelerate our AI hardware and software road maps and grow our data center GPU footprint. Turning to our client segment. Revenue was $1.4 billion, an increase of 85% year-over-year, driven by strong demand for our latest generation Ryzen mobile and desktop processors with OEMs and in the channel. Client segment revenue declined 6% sequentially. We saw strong demand for our latest generation Ryzen processors in the first quarter. Ryzen desktop CPU sales grew by a strong double-digit percentage year-over-year, and Ryzen mobile CPU sales nearly doubled year-over-year as new Ryzen 8040 notebook designs from Acer, Asus, HP, Lenovo and others ramped.
We expanded our portfolio of leadership enterprise PC offerings with the launch of our Ryzen Pro 8000 processors earlier this month. Ryzen Pro 8040 mobile CPUs deliver industry-leading performance and battery life for commercial notebooks. And our Ryzen Pro 8000 series desktop CPUs are the first processors to offer dedicated, on-chip AI accelerators in commercial desktop PCs. We see clear opportunities to gain additional commercial PC share based on the performance and efficiency advantages of our Ryzen Pro portfolio and an expanded set of AMD-powered commercial PCs from our OEM partners. Looking forward, we believe the market is on track to return to annual growth in 2024, driven by the start of an enterprise refresh cycle and AI PC adoption.
We see AI as the biggest inflection point in PC since the Internet with the ability to deliver unprecedented productivity and usability gains. We’re working very closely with Microsoft and a broad ecosystem of partners to enable the next generation of AI experiences powered by Ryzen processors, with more than 150 ISVs on track to be developing for AMD AI PCs by the end of the year. We will also take the next major step in our AI PC road map later this year with the launch of our next-generation Ryzen mobile processors code named Strix. Customer interest in Strix is very high based on the significant performance and energy efficiency uplifts we are delivering. Design win momentum for premium notebooks is outpacing prior generations as Strix enables next-generation AI experiences in laptops that are thinner, lighter and faster than ever before.
We’re excited about the growth opportunities for the PC market. And based on the strength of our Ryzen CPU portfolio, we expect to grow revenue share this year. Now turning to our Gaming segment. Revenue declined 48% year-over-year and 33% sequentially to $922 million. First quarter semi-custom SoC sales declined in line with our projections as we are now in the fifth year of the console cycle. In Gaming Graphics, revenue declined year-over-year and sequentially. We expanded our Radeon 7000 Series family with the global launch of our Radeon RX 7900 GRE and also introduced our driver-based AMD Fluid Motion Frames technology that can provide large performance increases in thousands of games. Turning to our Embedded segment. Revenue decreased 46% year-over-year and 20% sequentially to $846 million as customers remain focused on normalizing their inventory levels.
We launched our Spartan UltraScale+ FPGA family with high I/O counts, power efficiency and state-of-the-art security features, and we’re seeing a strong pipeline of growth for our cost-optimized embedded portfolio across multiple markets. Given the current embedded market conditions, we’re now expecting second quarter embedded segment revenue to be flat sequentially with a gradual recovery in the second half of the year. Longer term, we see AI at the edge as a large growth opportunity that will drive increased demand for compute across a wide range of devices. To address this demand, we announced our second generation of Versal adaptive SoCs that deliver a 3 times increase in AI TOPS per watt and 10 times greater scalar compute performance compared to our prior generation of industry-leading adaptive SoCs. Versal Gen 2 adaptive SoCs are the only solution that combines multiple compute engines to handle AI preprocessing, inferencing and post-processing on a single chip, enabling customers to rapidly add highly performant and efficient AI capabilities to a broad range of products.
We were pleased to be joined at our launch by Subaru, who announced they adopted Versal AI Edge Series Gen 2 devices to power the next generation of their EyeSight ADAS system. Embedded design win momentum remains very strong as customers adopt our full portfolio of FPGAs, CPUs, GPUs and adaptive SoCs to address a larger portion of their compute needs. In summary, we executed well in the first quarter, setting us up to deliver strong annual revenue growth and expanded gross margin driven by growing adoption of our Instinct, EPYC and Ryzen product portfolios. Our priorities for 2024 are very clear: accelerate our data center growth by ramping Instinct GPU production and gaining share with our EPYC processors, launch our next-generation Zen 5 PC and server processors that extend our leadership performance, and expand our adaptive computing portfolio with differentiated solutions.
Looking further ahead, AI represents an unprecedented opportunity for AMD. While there has been significant growth in AI infrastructure build-outs, we are still in the very early stages of what we believe is going to be a period of sustained growth driven by an insatiable demand for both high-performance AI and general-purpose compute. We have expanded our investments across the company to capture this large growth opportunity, from rapidly expanding our AI software stack to accelerating our AI hardware road maps, increasing our go-to-market activities, and partnering closely with the largest AI companies to co-optimize solutions for their most important workloads. We are very excited about the trajectory of the business and the significant growth opportunities ahead.
Now I’d like to turn the call over to Jean to provide some additional color on our first quarter results. Jean?
Jean Hu: Thank you, Lisa, and good afternoon, everyone. I’ll start with a review of our financial results and then provide our current outlook for the second quarter of fiscal 2024. We delivered strong year-over-year revenue growth in our Data Center and Client segments in the first quarter and expanded gross margin by 230 basis points. For the first quarter of 2024, revenue was $5.5 billion, up 2% year-over-year, as revenue growth in the Data Center and Client segments was partially offset by lower revenue in our Gaming and Embedded segments. Revenue declined 11% sequentially as higher data center revenue resulting from the ramp of our AMD Instinct GPUs was offset by lower Gaming and Embedded segment revenues. Gross margin was 52%, up 230 basis points year-over-year, driven by higher revenue contribution from the Data Center and Client segments, partially offset by lower Embedded and Gaming segment revenue contribution.
Operating expenses were $1.7 billion, an increase of 10% year-over-year, as we continued investing aggressively in R&D and marketing activities to address the significant AI growth opportunities ahead of us. Operating income was $1.1 billion, representing a 21% operating margin. Taxes, interest expense and other was $120 million. For the first quarter of 2024, diluted earnings per share was $0.62, an increase of 3% year-over-year. Now turning to our reportable segments, starting with the Data Center. Data Center delivered record quarterly segment revenue of $2.3 billion, up 80%, a $1 billion increase year-over-year. Data Center accounted for more than 40% of total revenue, primarily led by the ramp of AMD Instinct GPUs from both cloud and enterprise customers and strong double-digit percentage growth in our server processor revenue as a result of growth across our server product portfolio.
On a sequential basis, revenue increased 2%, driven by the ramp of our AMD Instinct GPUs, partially offset by a seasonal decline in server CPU sales. Data Center segment operating income was $541 million, or 23% of revenue, compared to $148 million, or 11%, a year ago. Operating income was up 266% year-over-year due to operating leverage, even as we significantly increased our investment in R&D. Client segment revenue was $1.4 billion, up 85% year-over-year, driven primarily by Ryzen 8000 series processors. On a sequential basis, Client revenue declined 6%. Client segment operating income was $86 million, or 6% of revenue, compared to an operating loss of $172 million a year ago, driven by higher revenue. Gaming segment revenue was $922 million, down 48% year-over-year and down 33% sequentially due to a decrease in semi-custom and Radeon GPU sales.
Gaming segment operating income was $151 million, or 16% of revenue, compared to $314 million, or 18%, a year ago. Embedded segment revenue was $846 million, down 46% year-over-year and 20% sequentially as customers continue to manage their inventory levels. Embedded segment operating income was $342 million, or 41% of revenue, compared to $798 million, or 51%, a year ago. Turning to the balance sheet and cash flow. During the quarter, we generated $521 million in cash from operations, and free cash flow was $379 million. Inventory increased sequentially by $301 million to $4.7 billion, primarily to support the continued ramp of data center and client products in advanced process nodes. At the end of the quarter, cash, cash equivalents and short-term investments were $6 billion.
As a reminder, we have $750 million of debt maturing this June. Given our ample liquidity, we plan to retire that debt using existing cash. Now turning to our second quarter 2024 outlook. We expect revenue to be approximately $5.7 billion, plus or minus $300 million. Sequentially, we expect Data Center segment revenue to increase by a double-digit percentage, primarily driven by the data center GPU ramp; Client segment revenue to increase; Embedded segment revenue to be flat; and, in the Gaming segment, based on current demand signals, revenue to decline by a significant double-digit percentage. Year-over-year, we expect our Data Center and Client segment revenue to be up significantly, driven by the strength of our product portfolio, and the Embedded and Gaming segment revenue to decline by a significant double-digit percentage.
In addition, we expect second quarter non-GAAP gross margin to be approximately 53%, non-GAAP operating expenses to be approximately $1.8 billion, non-GAAP effective tax rate to be 13%, and the diluted share count to be approximately 1.64 billion shares. In closing, we started the year strong. We made significant progress on our strategic priorities, delivering year-over-year revenue growth in our Data Center and Client segments and expanding gross margin. Looking ahead, we believe the investments we are making will position us very well to address the large AI opportunities ahead. With that, I’ll turn it back to Mitch for the Q&A session.
Mitch Haws: Thank you, Jean. Paul, we’re happy to poll the audience for questions.
Q&A Session
Operator: Thank you. We’ll now be conducting a question-and-answer session. [Operator Instructions] Our first question is from Toshiya Hari with Goldman Sachs. Please proceed with your question.
Toshiya Hari: Hi. Thank you so much for taking the question. Lisa, my first question is on the MI300. You’re taking up the full year outlook from $3.5 billion to $4 billion. I’m curious what’s driving that incremental $500 million in revenue? Is it new customers? Is it additional bookings from existing customers? Is it more cloud? Is it more enterprise, if you can sort of provide color there, that would be helpful. And then on the supply side, there’s been headlines or chatter that CoWoS and/or HBM could be a pretty severe constraining factor for you guys. If you can speak to how you’re handling the supply side of the equation, that would be helpful, too. And then I have a quick follow-up.
Lisa Su: Great. Thank you, Toshiya, for the question. Look, the MI300 ramp is going really well. If we look at just what’s happened over the last 90 days, we’ve been working very closely with our customers to qualify MI300 in their production data centers, both from a hardware standpoint, software standpoint. So far, things are going quite well. And what we see now is just greater visibility to both current customers as well as new customers committing to MI300. So that gives us the confidence to go from $3.5 billion to $4 billion. And I view this as very much — it’s a very dynamic market, and there are lots of customers, we said on the — in the prepared remarks that we have over 100 customers that we’re engaged with in both development as well as deployment.
So overall, the ramp is going really well. As it relates to the supply chain, actually, I would say I’m very pleased with how supply has ramped. It is absolutely the fastest product ramp that we have done. It’s a very complex product, Chiplets, CoWoS, 3D integration, HBM. And so far, it’s gone extremely well. We’ve gotten great support from our partners. And so I would say, even in the quarter that we just finished, we actually did a little bit better-than-expected when we first started the quarter. I think Q2 will be another significant ramp. And we’re going to ramp supply every quarter this year. So I think the supply chain is going well. We are tight on supply. So there’s no question in the near-term that if we had more supply, we have demand for that product, and we’re going to continue to work on those elements as we go through the year.
But I think both on the demand side and the supply side, I’m very pleased with how the ramp is going.
Toshiya Hari: Thank you for all the details. And then as my follow-up, I was hoping you could speak to your Data Center GPU road map beyond the MI300. The other concern that we hear is your nearest competitor has been pretty transparent with their road map and that extends into ’25 and oftentimes ’26. So — and maybe this isn’t the right venue for you to give too much. But beyond the MI300, how should we think about your road map and your ability to compete in Data Center?
Lisa Su: Yes, sure. So look, Toshiya, when we start with the road map, I mean, we always think about it as a multiyear, multigenerational road map. So we have the follow-ons to MI300 as well as the next, next generations well in development. I think what is true is we’re getting much closer to our top AI customers, they’re actually giving us significant feedback on the road map and what we need to meet their needs. Our chiplet architecture is actually very flexible. And so that allows us to actually make changes to the road map as necessary. So we’re very confident in our ability to continue to be very competitive. Frankly, I think we’re going to get more competitive. Right now, I think MI300x is in a sweet spot for inference, very, very strong inference performance.
I see as we bring in additional products later this year into 2025, that, that will continue to be a strong spot for us. And then we’re also enhancing our training performance and our software road map to go along with it. So more details to come in the coming months, but we have a strong road map that goes through the next couple of years, and it is informed by just a lot of learning in working with our top customers.
Toshiya Hari: Appreciate it. Thank you.
Lisa Su: Sure.
Operator: Our next question is from Ross Seymore with Deutsche Bank. Please proceed with your question.
Ross Seymore: Hey, thanks for letting me ask a question. The non-AI side of the data center business, it sounds like the enterprise side has some good traction even though the sequential drop happened seasonally, Lisa. But I was just wondering what’s implied in your second quarter guidance for the data center CPU side of things? And generally speaking, how are you seeing that whole kind of GPU versus CPU crowding out dynamic playing out for the rest of 2024?
Lisa Su: Yes, sure, Ross, thanks for the question. I think the — our EPYC business has actually performed pretty well. The market is a bit mixed. I think some of the cloud guys are still working through sort of their optimizations. I think it’s different by customer. We did see here in the first quarter actually some very nice early signs in the enterprise space, sort of large customers starting refresh programs. The value proposition of Genoa is very, very strong, and we’re seeing that pull through across the enterprise. In the second quarter, we expect overall data center to be up strong double digits. And then within that, we expect server to be up as well. And as we go into the second half of the year, I think there are a couple of drivers for us.
We do expect some improvement in the overall market conditions for the server business. But we also have our Turin launch in the second-half of the year that will also, we believe, extend our leadership position within the server market. So overall, I think the business is performing well, and we believe that we’re continuing to be very well positioned to gain share throughout the year.
Ross Seymore: Thanks for that. And I guess as my follow-up, just switching over to the client side. I noted you guided it up sequentially. Any sort of magnitude around that for the second quarter and perhaps more importantly when you talk about the whole AI PC side of things. Do you believe that’s more of a unit driver for you, an ASP driver or will it be both?