Dell Technologies Inc. (NYSE:DELL) Q1 2025 Earnings Call Transcript May 30, 2024
Dell Technologies Inc. reports earnings in line with expectations. Reported EPS was $1.27; expectations were $1.27.
Operator: Good afternoon, and welcome to the Fiscal Year 2025 First Quarter Financial Results Conference Call for Dell Technologies, Inc. I’d like to inform all participants this call is being recorded at the request of Dell Technologies. This broadcast is the copyrighted property of Dell Technologies, Inc. Any rebroadcast of this information in whole or part without the prior written permission of Dell Technologies is prohibited. Following prepared remarks, we will conduct a question-and-answer session. [Operator Instructions]. I’d like to turn the call over to Rob Williams, Head of Investor Relations. Mr. Williams, you may begin.
Rob Williams: Thanks, everyone, and thanks for joining us. With me today are Jeff Clarke, Yvonne McGill and Tyler Johnson. Our earnings materials are available on our IR website, and I encourage you to review these materials and the presentation, which includes content to complement our discussion this afternoon. Guidance will be covered on today’s call. During this call, unless otherwise indicated, all references to financial measures refer to non-GAAP financial measures, including non-GAAP gross margin, operating expenses, operating income, net income, diluted earnings per share and adjusted free cash flow. Reconciliations of these measures to their most directly comparable GAAP measures can be found in our web deck and our press release.
Growth percentages refer to year-over-year change, unless otherwise specified. Statements made during this call that relate to future results and events are forward-looking statements based on current expectations. Actual results and events could differ materially from those projected due to a number of risks and uncertainties, which are discussed in our web deck and our SEC filings. We assume no obligation to update our forward-looking statements. Now, I’ll turn it over to Jeff.
Jeff Clarke: Thanks, Rob, and thanks everyone for joining us. I’m really proud of the team and our strong performance in Q1. We executed well in an improving demand environment. Revenue was $22.2 billion, up 6% with exceptional growth in servers and a return to growth in our commercial PC business. Operating income was $1.5 billion, diluted EPS was $1.27 and cash flow from operations was $1 billion. We are uniquely positioned to help customers with artificial intelligence and our strong AI momentum continued in Q1. In ISG, our AI-optimized server orders increased to $2.6 billion, with shipments up more than 100% sequentially to $1.7 billion. We have now shipped more than $3 billion of AI servers over the last three quarters.
Our AI server backlog is $3.8 billion, growing sequentially by approximately $900 million. Our AI-optimized server pipeline grew quarter-over-quarter again and remains a multiple of our backlog. We’ve seen an expansion in the number of enterprise customers buying AI solutions, which remains a significant opportunity for us given we are in the early stages of AI adoption. Traditional server demand remained strong in Q1. It grew for the second consecutive quarter year-over-year and the fourth consecutive quarter sequentially. Storage demand has stabilized with revenue flat year-over-year. Moving to CSG. Commercial PC demand has also stabilized and we saw an improving demand environment as we moved through the quarter. CSG revenue was flat year-over-year with healthy operating profitability.
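As a quick arithmetic check on the figures quoted above, and assuming the sequential backlog change is simply orders booked less revenue shipped in the quarter, the numbers tie out:

\[
\Delta\text{Backlog} \approx \text{Orders} - \text{Shipments} = \$2.6\text{B} - \$1.7\text{B} = \$0.9\text{B},
\]

consistent with the roughly $900 million sequential build to the $3.8 billion backlog.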
We expect commercial PCs to continue to improve as the year progresses. We remain optimistic about the coming PC refresh cycle, driven by multiple factors. The PC install base continues to age, Windows 10 will reach end of life later next year and the industry is making significant advancements in AI-enabled architectures and applications. We will continue to focus on commercial PCs, the high end of consumer, and gaming, driving a strong attach motion, a strategy that has served us well across various economic cycles. Last week, we held our annual Dell Technologies World customer event in Las Vegas, and artificial intelligence was front and center. There was tremendous excitement around our approach to GenAI and the capabilities we’re bringing to customers and partners as they set out on their AI journey.
We highlighted our AI strategy to accelerate adoption of AI, which is built on five core beliefs. The first, data is the differentiator. 83% of all data is on-prem and 50% of data is generated at the edge. Second, AI is moving to the data because it’s more efficient, effective and secure. And AI inferencing on-prem can be 75% more cost effective than the cloud. Third, AI will be implemented in a wider range of ways, from locally on devices to massive data centers, depending on the use case. Fourth, you need an open, modular architecture to support rapid and sustainable innovation. And finally, AI requires a broad and open ecosystem to take advantage of the latest advancements. We launched the Dell AI Factory to help accelerate AI innovation and adoption.
It combines our Dell solutions and services optimized for AI workloads with an open ecosystem for partners, including NVIDIA, Meta, Microsoft and Hugging Face. The Dell AI Factory is the industry’s broadest AI-optimized portfolio of solutions and services that can be designed and sized to meet the specific requirements of our customers. We also extended our engineering leadership with new features and capabilities across our portfolio, including the new PowerEdge XE9680L, an 8-way GPU server with 12 Gen5 PCIe slots and direct-to-chip liquid cooling that improves overall power efficiency by 2.5x. With its 4U form factor, customers can buy the densest rackable architecture in the industry with up to 9 XE9680Ls and 72 high-wattage GPUs in one rack that can support performance up to 130 kilowatts straight from our factory.
In storage, we delivered significant software updates to PowerScale, which now includes the availability of QLC. Our PowerStore software updates give new and existing customers up to a 66% performance boost and native sync replication for file and block. We also added the PowerScale F910 to our AI-optimized portfolio. The hardware and software capabilities deliver 2x faster write performance than our competition. We also broadened our networking portfolio with the new PowerSwitch Z9864 with 51 terabits per second of throughput for AI workloads. In client, we announced Dell’s next-generation AI PCs; the Qualcomm Snapdragon X Elite powers our upcoming XPS and Latitude 7455, available in June. These 45-plus TOPS PCs can support 13 billion-plus parameter models, which means customers can run popular models like Llama 3 directly on their PCs. In closing, our strategy remains simple.
We are leveraging our strengths to extend our leadership positions and lean into new opportunities like AI driving incremental growth. We are giving customers more choices, flexibility and control of how and where they build, train, run and deploy AI closer to where the majority of their data exists, on-prem, increasingly at the edge, on-device, and across clouds to help them accelerate AI adoption. We are partnering with customers on every step of their AI journey to make it easy with an open, modular solution and a broad ecosystem of AI models, tools and partnerships. We are applying AI internally across our own business for better customer and team member experiences. And now over to Yvonne for the financials.
Yvonne McGill: Thanks, Jeff. In Q1, we again demonstrated our ability to execute and deliver strong cash flow, and our traditional business has stabilized with AI continuing to drive new growth. Our combined CSG and ISG revenue grew 8%, and our total revenue was $22.2 billion, up 6%. Gross margin was $4.9 billion and 22.2% of revenue, down 250 basis points, given a more competitive pricing environment and a higher AI-optimized server mix. Operating expense was $3.5 billion, or 15.6% of revenue, down 3%, and we will continue to be prudent in our cost management. Operating income was 6.6% of revenue and $1.5 billion, down 8%, driven by the decrease in our gross margin, partially offset by OpEx scaling. Q1 net income was $923 million, down 4%, primarily driven by lower operating income.
Diluted EPS was $1.27, down 3%. ISG revenue was $9.2 billion, up 22%. Server and networking revenue was a record $5.5 billion, up 42%. Server demand was even stronger, with growth across traditional and AI, and our AI-optimized mix of server demand increased again sequentially. We delivered storage revenue of $3.8 billion, flat year-over-year, with demand strength in HCI, PowerMax, PowerStore, and PowerScale. ISG operating income was 8% of revenue and $736 million, down 1%. Q1 is seasonally our lowest profitability quarter in ISG, given storage seasonality, and we expect ISG operating margin to improve as the year progresses. Our CSG revenue was $12 billion, flat year-over-year. Commercial revenue was $10.2 billion, up 3%, while consumer revenue was $1.8 billion, down 15%.
CSG operating income was $732 million, or 6.1% of revenue, down primarily due to a more competitive pricing environment. The commercial PC demand environment improved as we progressed through the quarter and we remain bullish on the coming PC refresh cycle and the longer-term impact of AI on the PC market. Our Dell Financial Services originations were $1.9 billion in Q1, up 1% year-over-year, despite the exit of our VMware resale business and the sale of our consumer revolving portfolio in Q3 of last year. DFS managed assets exited the quarter at $14.2 billion. Turning to our cash flow and balance sheet, adjusted free cash flow was $5.5 billion on a trailing 12-month basis, well above our average of $4.8 billion over the last five years as we continue to generate strong, consistent, and predictable cash flow over time.
Q1 cash flow from operations was $1 billion, primarily driven by profitability, partially offset by our annual bonus payout. Our cash conversion cycle was negative 47 days, flat sequentially, with higher inventory related to our AI business, offset by strong collections performance. We ended our quarter with $7.3 billion in cash and investments, down $1.7 billion sequentially, primarily driven by capital returns of $1.1 billion. In Q1, we purchased 6.7 million shares of stock at an average price of $108.38 and paid a $0.45 per share dividend. Since the inception of our capital return program at the beginning of FY’23, we have now returned $8 billion, or 103% of our adjusted free cash flow, to shareholders through stock repurchases and dividends.
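Taking the quoted figures at face value, two quick checks on the capital-return math: the Q1 buyback spend implied by the share count and average price, and the cumulative adjusted free cash flow implied by the 103% payout ratio.

\[
6.7\text{M shares} \times \$108.38 \approx \$0.73\text{B of Q1 repurchases},
\qquad
\text{cumulative adjusted FCF since FY'23} \approx \frac{\$8\text{B}}{1.03} \approx \$7.8\text{B}.
\]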
Turning to guidance, our open and modular AI-optimized server solutions, coupled with a broad ecosystem of partners and our engineering and services expertise, are winning with customers. We expect the AI momentum we’ve seen over the past three quarters to continue, driving incremental revenue for the year. In our core businesses, while the macro environment is still dynamic, our indicators point toward stabilization with improvement as we progress through the year. Against that backdrop, we expect Dell Technologies’ FY’25 revenue to be in the range of $93.5 billion to $97.5 billion, with a midpoint of $95.5 billion, or 8% growth. We expect ISG to grow in excess of 20%, fueled by AI, and our CSG business to grow in the low single digits for the year.
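For reference, the full-year revenue midpoint and the prior-year base implied by the stated 8% growth follow directly from the quoted range:

\[
\frac{\$93.5\text{B} + \$97.5\text{B}}{2} = \$95.5\text{B},
\qquad
\text{implied FY'24 revenue base} \approx \frac{\$95.5\text{B}}{1.08} \approx \$88.4\text{B}.
\]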
We expect the combined ISG and CSG business to grow 11% at the midpoint and our other business to decline as previously discussed on the Q4 call. Given inflationary input costs, the competitive environment, and a higher mix of AI-optimized servers, we do expect our gross margin rate to decline roughly 150 basis points. Even so, we expect both ISG and CSG operating margin rates to be within our long-term financial framework for the full year. We’ll continue to be disciplined in our cost structure and expect OpEx to be down low single digits for the year. We expect interest and other expense (I&O) to be roughly $1.4 billion. Diluted non-GAAP EPS is expected to be $7.65, plus or minus $0.25, up 7% at the midpoint, assuming an annual non-GAAP tax rate of 18%. For Q2 of fiscal ’25, we expect Dell Technologies revenue to be in the range of $23.5 billion to $24.5 billion, with a midpoint of $24 billion, up 5%.
We expect the combined ISG and CSG business to grow 8% at the midpoint, with ISG up in the mid-20s. We expect OpEx will be down roughly 3% year-over-year. Operating income is expected to improve sequentially, driven by quarter-over-quarter growth in ISG and sequential growth and margin rate expansion in our storage business. Q2 diluted share count should be between 723 million and 727 million shares. Diluted non-GAAP EPS is expected to be $1.65, plus or minus $0.10. In closing, we are optimistic about FY’25 and beyond, with a number of tailwinds, including AI and the coming IT hardware refresh cycle, and no one in the industry is better positioned than Dell. Our unique operating model, with reinforcing competitive advantages, has been honed for over 40 years now and is perfectly designed for this moment to help customers accelerate their adoption of AI.
We have the broadest end-to-end portfolio of solutions, with industry-leading positions across PCs, AI-optimized servers, traditional servers and storage. We have the industry’s leading supply chain with size and scale and we have the industry’s best go-to-market engine and an unmatched direct sales force supported by a strong channel partner network, all underpinned by our world-class services organization that can support customers across the globe. It’s an exciting time to be in technology and here at Dell and we are looking forward to a bright future. Now, I’ll turn it back to Rob to begin Q&A.
Rob Williams: Thanks, Yvonne. Let’s get to Q&A. We ask that each participant ask one question to allow us to get to as many of you as possible. Let’s go to the first question.
Q&A Session
Operator: Thank you. We’ll take our first question from Krish Sankar with TD Cowen.
Krish Sankar: Hi. Thanks for taking my question. Jeff, I just wanted to find out a little bit about the AI server backlog. What are the lead times there? How does it compare with the total server backlog? And can you talk a little bit about the pipeline you have for AI servers and the breadth of the pipeline? Thank you.
Jeff Clarke: Sure. I’ll take that question. Look, the lead times on our product vary and it’s complicated. We have product transitions from the H100 to the H200, as well as selling the GB200 and B200. Those are customer allocated. So to say that there’s an average lead time for our product is very dependent on the customer and what technology we’re talking about. But on average, if it were just about the availability of parts, the H100 lead time is better. Those products are in full production. NVIDIA is meeting demand, and they’ll have availability of the H200 on schedule towards the second or the latter part of Q2. So it’s in production, as is the B200. So that’s where the backlog-slash-lead times are. When you look at the composition of our backlog, it’s primarily NVIDIA-based.
Customers range from many enterprise customers to some large CSPs. So the composition is very much our customer composition. We’re excited about it continuing to build. It’s continuing to build every quarter. Our five-quarter pipeline continues to grow. There’s lots of projects out there. We’re out competing for each and every one of them, and we’d expect growth in both going forward.
Rob Williams: Thanks, Krish.
Krish Sankar: Thanks.
Operator: And our next question will come from Toni Sacconaghi with Bernstein.
Toni Sacconaghi: Yes, thank you for the question. If I just look year over year at the ISG business, storage was perfectly flat. AI servers went from zero to $1.7 billion, which sort of suggests that traditional servers were flat. So really the only thing that changed was you added $1.7 billion in AI servers, and operating profit was flat. So does that suggest that operating margins for AI servers were effectively zero? And if that’s not the case, how do you square the circle with what I just outlined? Thank you.
Yvonne McGill: Okay, Toni, I’ll take that one. So when I look at the overall ISG performance from an operating income standpoint, I’ll start with storage, right? If the operating income was low in storage, you know that Q1 is seasonally our lowest revenue quarter from a storage perspective. When the revenue declines, the business descales, and we saw that evidenced in the Q1 results. And while OpEx remained unchanged, to the points you’re making, the operating rates declined. In traditional servers, we saw strength in large enterprise and large bid mix, so a bit of a shift there, which, as you know, drives lower margin rates. And when I look into Q2 and FY’25, though, I’d tell you that we expect ISG operating income rates to improve, as we talked about in the guide, over the year and really deliver against our long-term framework, that 11% to 14%.
So, I think what we saw in the first quarter was multifaceted, but we do continue to expect recovery as the year goes on. And those AI-optimized servers, we’ve talked about being margin rate dilutive, but margin dollar accretive. And so you’ll continue to see that evidenced in the results also.
Rob Williams: Yes. Thanks for the question, Toni.
Operator: And our next question will come from Ben Reitzes with Melius Research.
Ben Reitzes: Hi, guys. Thanks a lot. Yvonne and Jeff, I want to take another stab at Toni’s question in a different way. You guys have said in the past that for every dollar of AI server revenue, there’s $2 of services, storage, and other higher-margin things that come with it. And when you see storage come in a little below expectations in the quarter, it seems like you’re yet to get that. And a lot of us are looking at the strength in your AI servers and looking forward to that gross margin tailwind to come, kind of like a razor and blade model. Do you still see that coming to fruition, Jeff and Yvonne, and how will that play out in the model? I know you don’t guide beyond this year, but it really should help next year over the three to five-year life of those contracts. So I’m just wondering if you still believe that and how you want us to keep that in perspective when we see margins like this. Thanks.
Jeff Clarke: No, Ben, look, our view of the broad opportunity hasn’t changed around each and every AI server that we sell. I think we talked last time, but maybe to revisit that, we think there’s a large amount of storage that sits around these things. These models that are being trained require lots of data. That data has got to be stored and fed into the GPU at a high bandwidth, which ties in networking. The opportunity around unstructured data is immense here, and we think that opportunity continues to exist. We think the opportunity around NICs and switches and building out the fabric to connect individual GPUs to one another, to take each node and racks of racks across the data center and connect it, that high-bandwidth fabric is absolutely there and needed.
We also think deployment of the rack itself is an opportunity: installing, whether that’s cables, heat exchangers, rear door heat exchangers, cooling units, power units, et cetera, the cabling. We think the deployment of this gear in the data center is a huge opportunity. And specifically around services, there are four types of services that we’re building through our own and through our partner network. They basically hit the categories of helping our customers with their AI strategy; how they implement AI, which is all around the data and getting the data prepared to be consumed and ingested; adopting, or putting the AI infrastructure in place and getting AI adopted inside an enterprise; and then lastly, how do we help them scale it?
And we look at the broad range of services that we have. We’re pretty excited about the full stack opportunities we have. A great example of that would be last week with the Dell AI Factory with the NVIDIA stack, which is a full stack opportunity with partners like Hugging Face to help customers deploy on-prem. So nothing’s really changed there. And I think if I were to take the question that Toni asked, and yours is in there as well, and I’ll let Yvonne add to this, we can do better in both our traditional servers and our storage products in terms of margins. We had a mix shift in storage that was from our Dell IP to partner IP. That impacts us. We had a customer shift and a geographic shift that pushed margins down. Our most profitable segment is Dell IP sold in North America to the largest customers.
They value the capabilities of our products and we get a return for that. That was a lower mix this past quarter. And then in the traditional server business, the margins were down as well. That was really driven by what Yvonne hit on, but I really wanted to make sure that we get this point across. We had good growth in our traditional server business, but it’s been in enterprise, and that’s a competitive market with large bids. And we acquired several new large customers during the quarter. We’ll take that deal every time because we know over time, winning a new customer, we can sell the breadth of our portfolio over time. That’s exactly what we did. And we’ll continue to look for those opportunities to grow our customer base. Yvonne?
Yvonne McGill: Yeah. And I think just in addition to what you said, Jeff, I’d just remind everyone that we’re at the beginning of this AI business, if you will. We’re finishing up our fourth quarter and we’ve been building the balance sheet. So it’s going to take time for you to start seeing that reflected in the P&L as we’re deferring services and support associated with the transactions that we’re doing. And so again, you’ll see that build over time and start seeing it more reflected in our financial statements.
Rob Williams: Thanks for the question, Ben.
Operator: And our next question will come from Erik Woodring with Morgan Stanley.
Erik Woodring: Great, guys. Thank you so much for taking the question. I’m going to kind of hit on a similar topic that everyone has, but Yvonne, you’re talking about improving ISG operating margins through the year. Obviously, it seems like the strength and momentum you have in AI servers means that will continue to become an increasing mix of revenue. You also have commodity cost headwinds to contend with. And so again, I know we’ve kind of talked about this topic, but maybe on a bit more detailed level, can you just help us understand what are the most significant factors we should be thinking about that would support ISG operating margin expansion as we work through the year? Is it pricing? Is it mix? Is it storage mix? Just help us understand the most important factors there again, as we look through the year. Thanks so much.
Yvonne McGill: Sure. I think it’s all of the things that you mentioned actually, but we talked about storage already, that we did not perform or mix at the level that we would have expected in the first quarter. Jeff talked about having more of our Dell IP storage solutions, and we’re expecting that as we go through the year. For ISG, and storage in particular, the lowest quarter sequentially or seasonally is our first quarter. And so we’re expecting that to continue to improve as the year progresses. We will be in a growth position, from a year-over-year growth standpoint, in the second half. So when I think of the ISG growth holistically that we’ve guided to across the year of 20%-plus, obviously there’s the AI server component to that.
But it’s also the expected growth in storage in the second half, traditional servers continuing to grow, et cetera. So it’s additive as the year progresses. And that is what we normally see in our storage and our traditional server performance. And then on top of it, we’ve got the AI momentum. So that’s a great adder to the portfolio. And as I already mentioned, as that balance sheet builds, we’ll start to see that reflected with the services and support in our financial results.
Jeff Clarke: Maybe a couple of things to add to that. We’ve talked, I think, much through this down cycle that the storage recovery lags servers. We expect the storage market to return to growth in the second half of the year, and for us to outperform the marketplace. Yvonne mentioned Dell IP specifically; I would call out PowerStore Prime. The addition of QLC, our performance, and native sync replication, to get my words correct here, allow us to be more competitive in the largest portion of the storage marketplace. And our storage margins need to improve and will improve over the course of the year. And then as we look at the traditional server marketplace, we mentioned that much of our growth was driven by enterprise. And as the rebound continues to play out, we’ll see commercial, medium business and small business, which are better margin profiles across that customer set.
Yvonne McGill: Yes. And the one thing I don’t think I called out specifically, the storage margins will continue to improve also because we will scale, right? We talked about the OpEx. We talked about that level of spend that we have, but as we scale that business, we will get that. And I’ll reiterate that we do expect ISG to finish FY’25 within our long-term framework, so 11% to 14%.
Rob Williams: All right. Next question, please.
Operator: Our next question will come from Wamsi Mohan with Bank of America.
Wamsi Mohan: Yes. Thank you so much. You mentioned the competitive pricing a few times in your prepared remarks. Should we think that Dell is driving that more competitive pricing for share gains to get that incremental full stack opportunity that Jeff you alluded to, or are you doing this more in response to what the competitors are doing? And when we think about that 50 bps of incremental gross margin pressure, is that just coming from incremental AI server mix or are there other moving pieces over there? Thank you.
Jeff Clarke: Wamsi, the first part of your question got cut off. If you wouldn’t mind repeating it so Yvonne and I can make sure we answer the question that you really want us to answer.
Wamsi Mohan: Yeah, sure. Jeff. I was just wondering if like the competitive pricing that you called out, is that coming really from Dell in order to take share in the marketplace because you have this full stack opportunity that you spoke about, or is this more in response to what other competitors are doing in the market?
Jeff Clarke: So let me take a stab at that and then Yvonne can certainly layer in on top of that. So I think there are various forms of competitiveness that we talked about. One, clearly, as the PC business has been in a down cycle for two years, as it’s beginning to stabilize and look for growth, it’s a competitive market out there. The consumer portion of the PC market is competitive in pricing; the strong promotions that we saw through the holiday season continued into Q1. The large deals in commercial PCs are quite competitive. And as I mentioned, SB and MB are lagging in their rebound compared to large enterprise, and that’s a source of margin in both our PC business and our server business. In servers, I think we made comments that the performance has been largely driven by enterprise, large accounts, large bids, acquisition, and those are competitive environments.
Again, we look at the long-term view of them and winning new customers in the data center serves us well over the course of a customer’s life cycle with us. And then specifically in AI, we’re not the price leader in AI. The engineering and the delta we have in the capabilities of our product show up time and time again; we are not the low-cost provider or the low price in the AI deals that I’m involved in, which are most of them. We are getting a premium for our engineering in the marketplace. We’re getting an advantage, but these large deals, when you’re talking clusters of tens of thousands or hundreds of thousands of GPUs, are a competitive environment, but we’re not the one running the price down. We are, again, getting a premium for the value that we’ve generated or created in our products.
And then conversely, I would tell you in the smaller deals, our margins are substantially better as you would expect. So in enterprise deployments that are smaller deals, our margin rates are significantly better than they are in the very, very large opportunities. And as enterprise continues to deploy, I think that bodes well for us over the long-term. I hope that helped.
Yvonne McGill: And let me add a little bit, specifically on that 50-basis point question. So that incremental gross margin headwind that we talked about of 50 basis points, it’s really two factors. Half of it is a margin rate impact and half of it is mix related. So we talked about that we’re seeing more AI mix, et cetera. So you’re seeing that competitive dynamic that we’ve already talked about, and also the inflationary cost environment is certainly having an impact on our gross margin rates. And given the AI server strength that we’ve seen, we’re shipping more than we expected, so that’s great news, but it’s leading to some margin rate dilution. So I hope that was helpful.
Wamsi Mohan: Yes. Thank you so much.
Rob Williams: Next question.
Operator: Next question will come from Michael Ng with Goldman Sachs.
Michael Ng: Hi. Good afternoon. Thank you for the question. I wanted to ask about the value-added services and working capital financing and longer-term DFS financing that Dell provides to its AI server customers. Is this a point of differentiation when you think about your competitive positioning relative to ODMs or less well-capitalized competitors? And would that answer be different when you’re thinking about an AI CSP customer versus an enterprise customer? Thank you.
Yvonne McGill: So let me take a run at that. We do have Dell Financial Services, and we are seeing that being a differentiator right now in the working capital solutions that we’re providing to our customers. We’re able to put together what we’re calling payment solutions for them, which is giving them different ways to manage their capital also. And so, I think it is a differentiated advantage that we have. It’s enabling us, especially with some of these new Tier 2 hyperscalers, if you will, to leverage our capabilities and our solutions also. So both from a financing and a product solution standpoint.
Michael Ng: Great. Thank you, Yvonne.
Yvonne McGill: Thank you.
Operator: And we’ll take a question from Amit Daryanani with Evercore.
Amit Daryanani: Thanks for taking my question. I guess when we think about the ISG growth of over 20% for the year, can you just talk about how much of that is AI servers versus the core ISG? If you could parse it out in a different bucket, that’d be really helpful. And then, Yvonne, could you also just clarify, the inventory was up dramatically in the quarter and it’s somewhat unusual for it to be up in this quarter. So just talk about what’s driving that and is it AI pre-builds or strategic inventory? Any help on that would be great as well. Thank you.
Yvonne McGill: Sure. So let me start with inventory, because I think that’s pretty straightforward. So inventory was up, I would say slightly, to 25 days, really representing about a $1.2 billion increase quarter over quarter. We mentioned inventory was up slightly as we ramp our AI server business. So I think it’s nothing substantial. I don’t know, Jeff, if you have anything to add on inventory.
Jeff Clarke: No, but we didn’t go out and make any strategic purchases. Some of the terms of the AI gear we need to deploy mean we take ownership of it. We did, and we have it in backlog. We’ll ship it as those customer orders are fulfilled. That was the driver. We weren’t out buying strategically or making strategic investments in inventory across the large components.
Yvonne McGill: And then, Amit in relation to your ISG question, you were asking about GPU and non-GPU servers. Is that basically the clarification?
Amit Daryanani: Yes, thinking about storage, x86 servers and then AI servers on top of it. If you just kind of give us a sense of how you think of those three buckets in the context of that over 20% growth, that would be helpful.
Yvonne McGill: So, obviously, we’re having really significant growth in the GPU servers. We obviously didn’t sell those all last year. So, really, it’s even hard to have a full year compare on that since they weren’t available all last year. But that is hyper growth, as we’ve been talking about. From an overall server standpoint, non-GPU servers, if you will, we’re expecting somewhere in the mid-single digits for the year for server revenue growth. And from a storage perspective, again, we talked about returning to growth in the second half. And so, you’d see that sequential improvement as the quarters progress. So growth, sequential growth, in the second quarter and then both sequential and year-over-year growth in the second half.
Amit Daryanani: Thank you very much.
Operator: And our next question comes from Simon Leopold with Raymond James.
Simon Leopold: Thank you very much for taking the question. I know, Jeff, I think you sort of alluded earlier to some of the customer aspects of the AI platforms, and I guess what I want to make sure I understand here is the nature or concentration of who the customers for your AI platforms are, because we know a number of years ago, Dell had moved away from selling servers to the hyperscalers due to the limited profitability. I’d like to get an update on what the customer mix looks like today and if there’s any kind of new concentration or what the exposure is to hyperscale versus Tier 2. And in that context, I’m referring to companies like CoreWeave as well as sort of large enterprises. If you could help us with the buckets. Thank you.
Jeff Clarke: You know, I think we talked about this last time as well, but we talked about our mix of business between Tier 2 CSPs and enterprise. I can tell you quarter over quarter the number of enterprise customers grew, the percent of revenue increased, and the amount of dollars we sold to enterprise customers of AI servers and AI gear increased. Is the balance still towards the Tier 2 CSPs? Yes. Those are the largest opportunities. They tend to be very project based, or nonlinear, if you will. Those people come in as they come in. Build out occurs, next opportunity, and then we pursue that. And we are involved in, I think, every one of those opportunities that are out there. We believe the opportunity for us long term, given our footprint and obviously our customer base, is the deployment of AI at scale in the enterprise.
We continue to be encouraged by the growing customers. The number of customers that are repeat buyers increased quarter over quarter, indicating they are moving from more proof of concept into deploying. And then the long-term opportunity in the enterprise for us is inference, where you might recall from the remarks I made last week at Dell Technologies World, long term, inference will be the largest use case, or what I like to call AI in production and use, over the course of the decade. So, the opportunity for us remains immense. We’re excited about it. Clearly, the near-term opportunity in training is with the Tier 2 CSPs, and the opportunity to take and build expert systems, the ability to use open-source models, the ability to use your own data on-prem, is clearly what we believe is happening.
The opportunity is there, and we’re there.
Simon Leopold: And just one item you didn’t mention is this concept of sovereign networks. Do you consider those Tier 2 or is that something different in your mind?
Jeff Clarke: It’s something different, and I normally get the question, so I’ll anticipate it: what’s the opportunity, and is this going to gap out? We haven’t even scratched the surface of that. So if you think about the Tier 2 CSP opportunities, our line of sight through the end of next year remains robust. A growing enterprise customer base is deploying across increasingly more use cases, heading towards inference. And then, as you just described, Simon, the sovereign opportunities that are in their early stages of development are a tremendous opportunity.
Simon Leopold: Thank you very much.
Operator: Of course. And our next question will come from Ananda Baruah with Loop Capital.
Ananda Baruah: Hi. Thanks for taking the question, really appreciate it. I wanted to ask you a question on inference, so I’ll just pick up where you just left off. How would you describe kind of where your enterprise customers, generally speaking, are on their inference journey right now, and when do you think it’s good for us to begin to expect your typical enterprise customer to begin to really lean into inference? And that’s it for me. Appreciate it. Thanks.
Jeff Clarke: I’d start back with: enterprises large and small and everything in between continue to work through what their strategy is for AI. And we continue to work with them with our professional services to help them understand their strategy and what use cases they are trying to solve for. Consistently across enterprise there are six use cases that make their way to the top of almost every discussion. It’s around content creation, support assistance, natural language search, design and data creation, code generation and document automation. And helping customers understand their data and how to prepare their data for those use cases is what we’re doing today. And then once you do that, you begin to go build a set of capabilities to implement them.
And how do they implement? Well, typically they’re using information retrieval systems, whether that’s RAG or vector databases or other techniques, that are building these expert systems along with large language models to protect their data and to keep their data on-prem and to utilize their unique information to get the outcomes they’re looking for. And customers, I mean, it’s a large customer base, as you would imagine, are in various stages of that. And we’ve just scratched the surface. The number of customers we have versus the number of customers that are buying AI gear from us, we have a lot of opportunity. And that’s what we believe the promise is long term: to be competitive in your respective industries, you have to deploy this technology.
I think I’ve said it publicly: it’s changing, it’s disruptive, it changes the basis of competition, it drives tremendous productivity gains. There’s not a single customer engagement I’m involved in where those words don’t get mentioned. It’s how to help them get there. And we’re in various stages of the journey. And I guess that’s my long-winded way of saying we’re really early on in inference and have a big opportunity in front of us. And I’ll reemphasize again, since it’s been a subject of the call, that in smaller deals, where we’re adding greater value across our deployment services and our vast capabilities around networking and around storage, margins improve across AI deployed in those environments.
Ananda Baruah: Thank you very much. Really appreciate it.
Jeff Clarke: Of course.
Operator: And we’ll take a question from Asiya Merchant with Citigroup.
Asiya Merchant: Great. Thank you for taking my question. Regarding the backlog, how should we think about conversion of this backlog of $3.8 billion into revenues as the quarter progresses? And at some point, as more chips come on board here, are you expecting to have the supply, or are you expecting there’s going to be, again, supply that’s going to be falling short here and backlog continuing to grow as a result of that in terms of lead times? Thank you.
Jeff Clarke: Let’s see if I can try to parse that. The backlog is similar to the lead time question, somewhat difficult to parse, because the backlog is across multiple technologies, H100, H200, B200, et cetera, even other alternatives. They have different customer commits and delivery dates depending on the priority. And the backlog clears as we go through that priority list. We’ve tried to reflect, to the best of our knowledge, the availability of supply versus the priority and our time through the factories in the guidance and outlook that Yvonne provided. That’s about the best I can do with where the state of the backlog is. Our job is to continue to sell more. Our five-quarter pipeline continues to look strong. It built again. We’re going to take POs. That means backlog builds as we wait for parts. That’s what we’ll do.
Rob Williams: Okay, nice to see you.
Operator: And we have a question from Steven Fox with Fox Advisors.
Steven Fox: Hi, good afternoon. I guess I was just curious on the inflation question on the component side and the pass-through of new GPUs within the servers. Are you seeing it develop as you thought 90 days ago or is there anything you would call out as maybe incremental pressure on the gross margins, et cetera? And any other thoughts on just the supply chain for the rest of the year? Thanks.
Jeff Clarke: Q1 was the deflationary period when we look at all of our input costs, and I think that’s the last one for the year. We expect our costs to go up in Q2, all forms of costs, our freight costs, our component costs. And then we see a step function in cost in the second half of the year driven primarily by DRAM and SSDs. I’m sure you’ve done your checks, but if you look at what’s happening there, you’re seeing SSD and DRAM in the second half of the year up mid-teens to 20% quarter-on-quarter. That’s what we think happens. Every indication is that the lack of capital expenditure, low factory utilization, and not a lot of wafer starts are going to lead to less supply than the market demand that will be out there. And the demand for AI servers, and putting high-bandwidth memory in these, high-performing DRAM in these, and lots of storage, is consuming lots of parts.
And as the backlog we just talked about gets built, we’re going to be consuming lots of material in the second half of the year. So that’s what I think happens: we’re going from the last quarter of deflation to now three quarters of inflation, with a step function in the second half. It does place margin pressure on us. Again, we’ve tried to reflect that to the best of our ability in our outlook. We’ve talked about this before. We generally recover about two thirds of the cost over a 90-day period. We have begun to reprice things. Part of our server recovery is we increased price not too long ago. And we’ll be looking at the cost structures of SSDs and DRAM across our entire product portfolio and adjust accordingly.
Steven Fox: Great. Thank you.
Jeff Clarke: You’re welcome.
Operator: And we have a question from Samik Chatterjee with JPMorgan.
Samik Chatterjee: Thank you for taking the question. Maybe if I change gears and sort of go to the CSG segment. You sounded more optimistic about commercial client improving through the rest of the year, maybe just share what you’re seeing or hearing from your customers in that regard. How much of this is prioritization of the replacement of the PC install base or even sort of acceleration of the replacement before the Windows end of life? And how do you — what are you hearing in terms of enterprises evaluating AI PCs in that decision-making cycle as well this year?
Jeff Clarke: Look, I’m glad you detected a bit of excitement. After two years of decline, to have the business grow again was good. Yvonne and I often joke about this, one data point does not make a trend. We need a few more dots on the old chart to see a trend that we can call a recovery, but we’re encouraged. And that’s a good sign. Some other things that we, I think, talked about that are worth noting: demand improved over the course of the quarter. The pipeline grew over the course of the quarter. We haven’t had that in two years. We saw more activity in large businesses. That was encouraging. We still have an opportunity to see SB and MB pull along with that. So you can’t call a recovery until you see MB and SB pull along with it.
But we remain encouraged, and the three indicators that I called out in our remarks are what we see. The install base has never been older. Four years ago today, we were all home working remote, trying to mobilize our employee base to be remote. Those products are all four years old now. And that is typically when things get replaced, and that’s built up. It will be the same through calendar ’25 or fiscal ’26. And that is a reason for belief that there is a refresh coming, because the install base has never been older and it needs to be updated. You have the additional component of the end-of-life of a version of Windows. History has indicated that when there’s an end-of-life of an operating system, there is a buying pattern to get refreshed and updated.
And then we have the promise of what you said of AI PCs. We launched our first AI PCs much earlier this year with Meteor Lake across our entire corporate portfolio. We just announced a handful of new ones last week at Dell Technologies World. There’s more coming. The application base is building against that, whether that be what Adobe is doing, CrowdStrike is doing, what Zoom is doing, where you’re getting more performance or more capability or the announcement from Microsoft last week with Copilot Plus where you can do things like search, recall, live caption, and other exciting new capabilities that we think will drive ultimate demand for PCs. So that’s the backdrop. I’d be remiss if I didn’t tell you that IDC, if you hadn’t seen, took down the forecast for the year.
Our internal models took it down a little bit as well. But we believe we continue to grow. As we said, low single digits, work to do. We plan to take share and outperform the marketplace.
Rob Williams: Okay. Thanks, Samik. We’ve got time for one more question and then we’ll turn it over to Jeff for some final thoughts.
Operator: Thank you. We’ll take our final question from David Vogt with UBS.
David Vogt: Great. Thanks, Rob, for squeezing me in. Yvonne, this is a question for you. If I take your comments and Jeff’s comments around AI servers, it appears that there’s a modest linear ramp in sort of the AI server revenue recognition as we go through 2Q and the second half of this year. If that’s right, what is the gating factor? I know you talked about it earlier and Jeff talked about different sort of customer priorities and timelines. But I’d imagine that you would have seen a stronger ramp in 2Q, 3Q and 4Q. And if that’s the case, are you seeing a stronger demand in 2Q that’s impacting gross margins that affects sort of the guide for this particular quarter?
Yvonne McGill: So, I’ll start on that from a P&L perspective, but I really do think it’s about the demand, and Jeff hit on the dynamic demand and supply environment that we are experiencing there and the backlog that continues to build right now. So, when I was talking about AI and how we recognize server margins, I talked about them being rate dilutive but margin dollar accretive, and about building the balance sheet up. We’re deferring; we’re selling these solutions with services, and those services get recognized over time. And so we’ll continue to build that up. So when we look at the deal holistically, on how we’re negotiating with the customer and what benefit we’ll receive over time, these are favorable deals for us.
But again, in the P&L, you have to build up that balance sheet, and we have significant balance sheet strength across our entire portfolio. And we’re building this element of it, as this is, I’ll call it, a new startup business for us. I don’t know, Jeff, is there anything you’d add to that?
Jeff Clarke: No. We think the demand is there. You can look at many of the external sources and you can see demand for AI broadly deployed. The industry’s optimistic, whether that’s token growth, whether that’s data growth, whether that’s the flops required to process it. What’s happening is we build out these mega-clusters to train next generation AI models and applications. This is a demand driver for the course of the decade.
Yvonne McGill: And we’ll build up the balance sheet as the demand and the expansion in the enterprise continue, and it will manifest in the P&L.
Jeff Clarke: Awesome. Thanks, everyone. We’re leveraging our engineering expertise to deliver open, modular, and comprehensive AI solutions for our customers from the largest hyperscaler data centers to enterprise customers across the globe. That’s something no other company in the industry can do as well as us. We’ve gained tremendous traction with our AI solutions over the past three quarters. And as you can see from our announcements at Dell Technologies World, our innovation engine is humming and we’re just getting started. Thanks for your time today.
Operator: This concludes today’s conference call. We appreciate your participation. You may disconnect at this time.