So there, we’re very pleased with how we performed in the second quarter. And in both CSG and ISG we are expecting that competitive pressure and those mix dynamics, especially in the third quarter in ISG as we mix less toward storage seasonally, and we’ll mix back into storage in the fourth quarter.
Operator: We’ll take our next question from Aaron Rakers with Wells Fargo.
Aaron Rakers : Yes. Thanks for the question, and congrats also on the results. I’m going to shift away from the AI narrative and maybe talk a little bit about the balance sheet side of the equation for Dell. I think in your prepared comments, you highlighted that you’ve now got, I think, close to $9.9 billion of cash on the balance sheet. And one of the things you also mentioned was improved, or increased, flexibility on the capital return side. So I’m curious how you’re thinking about capital return relative to M&A. And any context on how much capacity, or maybe put another way, how much operational cash you necessarily need, as we think about the excess cash you’re carrying on the balance sheet on top of the free cash flow generation for the company.
Tyler Johnson: Yes, this is Tyler. Maybe I’ll start and Yvonne might step in. But I think we’ve always talked about our minimum cash balance being somewhere around, call it, $4 billion to $5 billion. Now, that doesn’t necessarily mean I’m going to run at those levels. But clearly, we have excess cash. As you saw, it was a really strong cash quarter. And actually, if you look at the first half of the year at $5 billion of cash flow from operations, that’s a record, right? So great job by the teams on working capital, something we’ve been extremely focused on, and it’s great to see the progress that we made. So, just like Yvonne mentioned during the opening remarks, that does give us more flexibility. As I think about our capital return framework, nothing has changed there.
So as we look at share repurchase, for example, we look at dilution first, and then we think about opportunistic buys. And I think this just gives us more flexibility as we’re working through that. We don’t provide guidance on that, but it’s obviously something that we’re looking at.
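As a rough illustration only, combining the roughly $9.9 billion cash balance cited in the question with the $4 billion to $5 billion minimum Tyler mentions (approximate figures, not a company-provided calculation), the implied excess would be:

\[
\text{Excess cash} \approx \$9.9\text{B} - (\$4\text{B to }\$5\text{B}) \approx \$4.9\text{B to }\$5.9\text{B}
\]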
Operator: We’ll take our next question from Erik Woodring with Morgan Stanley.
Erik Woodring : Jeff, I was hoping to maybe spend some time on storage in the quarter; revenue was down just 3% year-over-year, and I imagine that was better than expected. Can you maybe just walk us through what you’re seeing in the storage market? Where do you think we are in the cycle? Maybe said differently, how much of the performance in 2Q was Dell specific versus the market? And if you could weave in whether there is any storage pull-through from AI servers, that would be helpful as well.
Jeff Clarke: Sure. I mean, our performance was primarily driven by our strength in HCI, most notably the growth of PowerFlex, our proprietary software-defined storage solution. We’re seeing great momentum there, and its ability to independently scale compute and storage for high-performance applications is being embraced in the marketplace. It has clearly grown; I think I remarked that it has now grown for eight consecutive quarters, and it grew triple digits, more than doubling in the quarter. So that certainly was a highlight of the storage portfolio. PowerStore, our mid-range offering, has now grown 12 consecutive quarters. It is the mainstay of our mid-range offer, so that continues to be a strength of the business, particularly given that the largest multinational customers in the world are very guarded in their buying; being able to sell to large corporates and medium-sized businesses is certainly the home of our midrange product.
I mentioned earlier in one of the questions that our high-end storage is going through that down cycle: we saw the mainframe refresh, we saw a buildup through the COVID time, and now we’re in the digestion of the capacity that was brought online. That’s the backdrop of our storage business. I’m very optimistic. We’re working to get a tighter correlation where the AI compute side drives the unstructured storage side and the object storage side, our ECS business. When you think about the large amounts of data this is going through, and I think we’ll really see this as enterprises deploy AI more broadly, unstructured data in its various forms will be looking for highly scalable solutions, and we have the most highly scalable, high-performance unstructured systems in the market with our PowerScale and ECS object storage.
So I’m optimistic, but I would tell you there’s not a tight correlation at the moment. Most of it is compute with software-defined storage inside that compute. I hope that helped.
Operator: We’ll take our next question from Amit Daryanani with Evercore.
Amit Daryanani : Congrats on a nice set of numbers here. Maybe just to go back to the AI server opportunity, I was hoping you folks could talk about who the customers buying this from Dell are. Are they traditional enterprises, or are they hyperscale customers? And I guess, Jeff, maybe just help us understand what Dell’s value proposition is when it comes to GPU-enabled servers. Because I think the concern might just be the durability of these revenues if they’re coming from customers that typically buy from ODMs and are coming to you because they can’t get GPU allocations. Can you maybe just talk about the value prop Dell is providing and who the base is? And is this accretive or dilutive, then, to your ISG margins?
Jeff Clarke: There must be four or five questions in that question, so let me work my way through it. I mean, clearly, we believe today, and hopefully my words at the opening resonated, that this is a big incremental opportunity and that these new workloads demand a new type of architecture and a new type of technology. We believe we’ve hit the sweet spot with that with our XE9680, for example, but there are three other AI servers in our portfolio as well. If you think about the 9680, why is it an interesting product? Clearly, we worked closely with NVIDIA over three years tuning its performance. We believe it’s the highest-performance, most dense AI server you can buy today. It comes as a 6U product. Think about its power efficiency, what we’ve been able to do around air cooling at an ambient temperature of 35 degrees Celsius, what we’ve done with iDRAC, and what we’ve done on the connectivity side with 10 PCIe ports for it to work in high-performance clusters.
So we’ve built something purposely for AI. Then think about the types of services that we announced at Dell Technologies World and, subsequently, the broader range of services with Helix: the ability to help enterprises deploy this, help them understand where their data is, how to get their data prepared, how to implement the infrastructure with ease, and how to begin to train models, tune models, and ultimately be able to run inference at the edge or in their data center. We’re just at the beginning with that package of services and capabilities. The types of customers we’re selling to today span a wide range, with a density today among some of the new AI-as-a-service companies. We’re seeing enterprises, as I mentioned, buy early in small volumes so they can do proofs of concept, so they can begin to understand, test, and do that sort of work.
But if you look at the long-term attributes of this opportunity, we think it’s AI in a lot of places: at the edge, in the data center, in the cloud. It’s going to track the data. In my mind, and architecturally, in every way that we’ve looked at this, AI is going to follow the data. It’s highly unlikely you’re going to have a smart factory or a smart hospital or a set of robots that are going to continuously be trained or run inference a long way away. Latency will matter. We think security will matter. We think performance will matter, and we ultimately think cost will matter. And when you put that equation together, we think it’s going to be a hybrid world. There will be some AI done in the cloud, and some AI done on-prem.
We think it’s going to be very, very heterogeneous in the way this will be done, with classic compute as well as accelerated compute. In a nutshell, that’s how we think about the opportunity. I certainly could go into more detail, but I think I hit that series of questions, at least on the surface.
Operator: We’ll take our next question from Wamsi Mohan with Bank of America.
Wamsi Mohan: Your primary PC competitor noted a lower PC TAM in 2023 against a backdrop of a more challenging component cost environment. I was wondering, Jeff, if you could maybe talk about the trends in commercial and consumer through the rest of this year and the setup into next year, from a PC TAM perspective and also from a margin perspective?
Jeff Clarke: Sure. Our view of the market hasn’t materially changed from our last call. I hear a scratch, maybe that’s on our end; I’ll try to talk over it. We see the market at roughly 250 million units, which would put it down roughly 15% from last year, two consecutive years down, and we see the decline slowing in the second half against easier compares. That’s reflected in the guidance Yvonne just gave for both Q3 and the full year. So we see the rate of PC decline slowing to the point where, as we head into next year, we’re optimistic that there is growth in PCs next year, low single digits. We can debate that number, but if you asked us today, we’d say the opportunity to grow next year is probably in the 3% to 4% range.
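As a rough sanity check on the figures quoted above, taking the roughly 250 million units and the roughly 15% decline at face value (illustrative arithmetic only, not a company-provided figure), the implied prior-year TAM would be:

\[
\text{TAM}_{\text{prior year}} \approx \frac{250 \text{ million units}}{1 - 0.15} \approx 294 \text{ million units}
\]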