Toshiya Hari: Thank you.
Operator: Thank you. [Operator Instructions] And our next question comes from the line of Timothy Arcuri from UBS. Your question, please.
Timothy Arcuri: Thanks a lot. I had a question on the prepays that you're getting. That's $600 million, which is a pretty big number. So is that more of a one-time deal, or should we expect these prepays to continue? And as part of that, is that mostly related to, say, HBM or a particular vertical like data center, or is it across most of your end markets? And then I had a follow-up as well.
Sanjay Mehrotra: So obviously, due to the confidential nature of this agreement, we cannot provide any specifics around these prepayments. But what I would like to point out is that it reflects the importance of our technology, our products, and our delivery capabilities. It also reflects our close relationships with our partners; it is a good example of commitment from our customers as well as from Micron. Beyond that, I'm not able to provide any specifics here, again honoring the confidentiality.
Timothy Arcuri: Okay, got that. I guess then can you talk, Sanjay, just about limiting the bit shipments? I think you said you're limiting bit shipments to prevent pull-ins ahead of price increases. It sounds like bits are flat sequentially for fiscal Q2 and fiscal Q3. Can you talk about the logistics of that? Are you just holding back on volumes to regain some pricing leverage? I guess if your competitors don't match that approach, you might risk losing some share. So can you just talk about the logistics of that? Thanks.
Sanjay Mehrotra: Well, as we noted, leading-edge supply is already tight, and that certainly impacts some of our shipments in FQ2. FQ2, of course, is also impacted by seasonality. So managing supply, given the tight environment on the leading nodes, is really the main consideration behind the profile we are guiding you to. And of course, as we allocate that supply across customers, we want to make sure that we are managing our shipments to our customers carefully.
Timothy Arcuri: Okay. Thank you so much.
Operator: Thank you. [Operator Instructions] Our next question comes from the line of Mehdi Hosseini from Susquehanna Financial. Your question, please.
Mehdi Hosseini: Yes, thanks for taking my question. Two follow-ups. Sanjay, historically the memory industry has tended to gravitate toward higher-margin products, which has led to margin erosion. Why is HBM any different? And what are your thoughts? Is there anything you can share with us as to what can preserve the higher margins associated with these high-end products? And I have a follow-up.
Sanjay Mehrotra: I think AI is in very early innings. Gen AI is barely starting, and these are great growth opportunities ahead. This is the biggest revolution since the Internet, and recently you have heard industry estimates of a data center AI accelerator TAM CAGR of about 70% over the next few years. Of course, as those opportunities grow with data center accelerators from various suppliers, the whole infrastructure grows, and it's about AI and Gen AI applications, from training to inferencing, proliferating all across the data center environment. And along with the growth in data center AI accelerators, the rest of the infrastructure, including HBM, will continue to grow.
So we project that the HBM CAGR will be over 50% over the next few years. And when you think about it, that is more than 3 times the DRAM industry CAGR that we are talking about. And still, we are in the very, very early innings: 2023 is the first year of meaningful HBM shipments in the industry, and that corresponds to only a low-single-digit percentage of industry bits shipped, but with much higher pricing and a much higher revenue opportunity. So as we look ahead, we see HBM continuing to grow strongly in the industry. Its demand will grow, and it will be a key enabler of Gen AI applications in training as well as inferencing, because more and more data is required. Larger and larger large language models, and more training on more data, just drive more demand for high-bandwidth, high-performance, low-power memory.
So this is the very beginning; it has a long way to go. And the other important factor with HBM, as we have discussed, is that it takes more than 2 times as many wafers to produce the same number of bits as D5. So it is a headwind to supply growth in the DRAM industry, and it has the effect of helping strengthen the industry's supply-demand balance as well. So I think these are some of the important aspects. The important thing is that this has to be looked at as a long-term growth opportunity. And of course, we are excited about getting our HBM share to align with our DRAM share sometime in 2025. As with any large opportunity, over time it will certainly have some ebb and flow in terms of demand and supply, and we will prudently manage this.
And maintaining flexibility in managing this is absolutely key. You have seen us manage this well over time in our overall industry, for DRAM as well as for NAND, on the part of Micron, and we will continue to manage it in that fashion. It is an exciting opportunity. We are well positioned with our product, and we look forward to continuing to grow revenue and profit contribution with this product line over the next few years.
Mehdi Hosseini: Great, thanks for the detail. Just a quick follow-up for Mark. Is there a normalized capital intensity that we should think of, especially as we come off this kind of nuclear winter in memory? And if you don't have a normalized capital intensity, what else is out there that could help us better forecast free cash flow?