And these are all part of our discussions when we address customers' requirements for their future purchases, when we address LTA requirements. And of course, we need the necessary investments related to our production mix, in terms of die requirements and in terms of our assembly and test requirements, and we work closely with our customers to help manage these. And just keep in mind that, as I mentioned, a lot of new product considerations go into the LTAs as well, along with, of course, the volume and overall demand and supply considerations. So LTAs, in the end, really help both parties. They help us plan our engineering, our product roadmap and our alignment on it, and our investments in things such as back-end capacity, because products like HBM, products like high-density modules and, of course, in the mobile sector, products like MCPs, et cetera, all have different considerations at the back end.
And these are the kinds of things LTAs really help us plan with our customers.
Timothy Arcuri: Thanks so much.
Operator: Thank you. One moment for our next question. And our next question comes from the line of Krish Sankar from TD Cowen. Your question, please.
Krish Sankar: Thanks for taking my question, and congrats on the good results. Sanjay, the first question I wanted to ask you: you spoke about AI servers having 6 to 8 times more DRAM content, and that demand is strong while traditional data center server demand is weak. There's a view that some of these AI servers are each replacing more than 10 regular data center servers. So I'm just kind of curious how to think about overall DRAM demand as AI grows but probably cannibalizes some of your regular data center server DRAM content? And then, I have a follow-up.
Sanjay Mehrotra: So look, when we look at the overall DRAM demand, the DRAM TAM, of course, AI is driving growth. Automotive is certainly driving growth. Other end markets we mentioned, such as mobile, PC and consumer, have been somewhat lackluster in terms of end demand. The AI demand in the data center, whether in the cloud or in the enterprise, definitely drives healthy trends for memory growth. Yes, enterprise server and some of the data center demand has recently been somewhat impacted by the macro trends, but the trend of AI driving more memory is absolutely continuing. And when we look at our overall 2023 demand growth and the projections of CAGR that we have ahead of us, we have taken those factors into account.
This is very, very early innings for AI, and AI is really pervasive. It's everywhere: in cloud applications, in enterprise server applications, and, of course, in applications such as generative AI, which will be in enterprises too, because due to confidentiality of data, enterprises will be building their own large language models. And as you know, while enterprise large language models may not be as large as the ones you may see trained on superclusters, et cetera, all of them are really trending toward a greater number of parameters. With generative AI, we are now talking about parameters getting into even the trillion range. Not too long ago, these used to be in the hundreds of millions. That requires more memory. So regardless of the application, whether it is on the enterprise side or on the cloud server side, memory requirements are continuing to increase.