I think the second part of your question was with regards to product revenue growth and the timing of DDR4 to DDR5 for next year. As a company, we only guide one quarter at a time, which is prudent, especially given the macro uncertainty just now. We’ve been very pleased with our execution in 2023, and assuming the midpoint of our guidance for Q4, our 2023 product revenue is relatively flat at $226 million, which really provides a foundation for us to grow next year. And it is important to put this performance in context, because the market has declined double digits this year, and we do expect to gain market share in 2023 versus our competitors. As it relates to 2024, we do anticipate growth, which is in line with all the industry research that’s out there just now.
And we’re excited about our competitive product offerings, which will benefit from the continued growth of DDR5 platforms. We do anticipate a gradual recovery of DDR4. And as Luc mentioned, we expect to see a greater contribution from the companion chips in the second half of next year. So overall, our products are very well positioned in the market. We do expect to grow faster than the market next year and to see continued share gains in 2024.
Sidney Ho: Okay, that’s helpful. Then my follow-up question is, you touched upon this in the prepared remarks, but it sounds like you are very well positioned to benefit from AI. Can you help us summarize the different ways that you can benefit from that ramp, both from a product and licensing standpoint? I understand the memory licensing business is kind of fixed, so I just want to make sure I understand the opportunity.
Desmond Lynch: Yes. If you look at the 3 pillars of our business, our patent licensing business is, of course, not affected by AI. On the product side, let’s start with the buffer chips. AI servers do actually include some general-purpose servers in the AI boxes, and typically those general-purpose servers have a high memory content, typically DDR5. So we are seeing the impact of AI: especially this year, although the market was a bit depressed in total, the positive impact of AI has been to accelerate the demand for DDR5 modules. And this, I think, partially explains the profile of our DDR4/DDR5 mix between Q1, Q2 and Q3 of this year. With respect to our IP business, what we see with AI is the emergence of specialized compute nodes and disaggregated architectures.
So all of these chips have to communicate between themselves, so CXL and PCIe IP become very, very important for our silicon customers as they build these heterogeneous chips that go into the new architectures. GDDR and HBM matter as well, of course; that’s why we announced our next-generation HBM at very, very high speed. We want to stay ahead of the curve there. And finally, as we mentioned in the prepared remarks, with all of these specialized chips now in the data center, the vulnerability with respect to attacks on data at rest, or actually data in motion between chips, is becoming more important. And therefore, our security IP portfolio is becoming more relevant to the market. So when we sold our PHY business to Cadence, we said we would continue to invest in our IP portfolio, because we do see the opportunity brought by AI to all the semiconductor companies that actually build chips for that market.
Operator: Our next question is from Mehdi Hosseini with SIG.
Mehdi Hosseini: Yes. So just a quick clarification. Look, I just want to go back to your prior statement, and this comes up every earnings conference call. Would it be fair to say that your product revenue is mostly driven by the number of DIMMs and not necessarily by the number of DDR5 bits? In other words, your business is driven by units of DIMMs, not bits of DDR5. Would that be a fair statement?
Luc Seraphin: That’s correct. There are several factors going into the capacity equation, if you wish, one being the density of the memory itself. But the key driver, as you rightfully say, for our buffer chips is really the number of [indiscernible].