Credo Technology Group Holding Ltd (NASDAQ:CRDO) Q1 2024 Earnings Call Transcript August 24, 2023
Credo Technology Group Holding Ltd beats earnings expectations. Reported EPS was $0.03, against expectations of -$0.03.
Operator: Ladies and gentlemen, thank you for standing by. At this time, all participants are in a listen-only mode. Later we’ll conduct a question-and-answer session. [Operator Instructions] I would now like to turn the conference over to Dan O’Neil. Please go ahead, sir.
Dan O’Neil: Good afternoon, and thank you for joining us on our first quarter earnings call for fiscal 2024. Joining me today from Credo are our Chief Executive Officer, Bill Brennan; and our Chief Financial Officer, Dan Fleming. I’d like to remind everyone that certain comments made in this call today may include forward-looking statements regarding expected future financial results, strategies and plans, future operations, the markets in which we operate, and other areas of discussion. These forward-looking statements are subject to risks and uncertainties that are discussed in detail in our documents filed with the SEC. It’s not possible for the company’s management to predict all risks nor can the company assess the impact of all factors on its business or the extent to which any factor or combination of factors may cause actual results to differ materially from those contained in any forward-looking statements.
Given these risks, uncertainties, and assumptions, the forward-looking events discussed during this call may not occur and actual results could differ materially and adversely from those anticipated or implied. The company undertakes no obligation to publicly update forward-looking statements for any reason after the date of this call to conform these statements to actual results or to changes in the company’s expectations except as required by law. Also during this call, we will refer to certain non-GAAP financial measures, which we consider to be important measures of the company’s performance. These non-GAAP financial measures are provided in addition to and not as a substitute for or superior to financial performance prepared in accordance with U.S. GAAP.
A discussion of why we use non-GAAP financial measures and reconciliations between our GAAP and non-GAAP financial measures is available in the earnings release we issued today, which can be accessed using the Investor Relations portion of our website. I’ll now turn the call over to our CEO. Bill?
Bill Brennan: Thank you, Dan, and thank you all for joining our Q1 fiscal ’24 earnings call. I’ll begin by providing an overview of our fiscal Q1 results. I’ll then highlight what we see going forward into fiscal ’24. Dan Fleming, our CFO, will follow my remarks with a detailed discussion of our Q1 financial results and share the outlook for the second quarter. We would then be happy to take questions. For Q1, Credo reported revenue of $35.1 million. Additionally, we reported non-GAAP gross margin of 59.8%. Our Q1 results and future growth are driven by the accelerating market opportunity for high-speed connectivity solutions. Our electrical and optical connectivity solutions deliver leading performance at port speeds ranging from 50 gigabits per second up to 1.6 terabits per second.
While we primarily serve the data center Ethernet market today, we continue to extend into other standards-based markets as the need for higher-speed and more power-efficient connectivity increases exponentially. The changing workloads in the data center, specifically with regard to the onset of generative AI applications, are driving the demand for higher-bandwidth and higher-density networking. This plays directly into Credo’s strengths. All of Credo’s connectivity solutions leverage our core SerDes technology and our unique customer-focused design approach, enabling Credo to deliver optimized, secure, high-speed solutions with significant benefits in power efficiency and cost. Credo continues to see increasing customer engagements across our products and IP solutions, which include active electrical cables, or AECs, optical DSPs, laser drivers and TIAs, Line Card PHYs, SerDes chiplets, and SerDes IP licensing.
I’ll now review our overall business to give more perspective. First, regarding our AEC business. Credo remains a pioneer in the AEC market. Industry analysts forecast increasing AEC market penetration as port speeds increase for intra-rack connectivity. Our AEC solutions offer significant benefits compared to both passive direct attach copper cables, which are physically cumbersome and have poor signal integrity at higher speeds, and active optical cables, or AOCs, which are significantly higher power and higher cost. Today, our largest customer deploys our AECs for both general compute and AI applications. Additionally, we continue to design custom AEC solutions to solve for their next-generation deployments, including their first internally developed 100 gig per lane AI deployment.
At our second hyperscaler customer, our production ramp for both general compute and AI programs remains on track, with expectations for continued growth throughout this fiscal year and fiscal ’25. We also continue to make progress on their next-generation platforms, with Credo receiving commitments for multiple 100 gig per lane AEC programs. We attribute much of our success with our existing customers to our system-level approach to the AEC market. Our approach enables us to quickly respond to customers’ requests and deliver innovative, feature-rich AEC solutions tailored to our customers’ specific requirements. This approach has also led to further progress with additional hyperscalers, who are in various stages of evaluation and qualification of our AECs. We’ve also seen a growing number of Tier 2 data center operators and service providers adopting Credo AEC solutions.
We have earned meaningful revenue from these customers to date and expect this customer category to grow in the future. In summary, we’re happy with our progress with customers and we’re encouraged by the accelerating market demand for 50-gig and 100-gig lane rates for in-rack connectivity. Regarding our optical solutions, within this market, we remain a disruptor. We leverage our core SerDes technology and customer-specific approach to deliver a compelling combination of performance, power, and cost. Credo’s optical solutions comprise DSPs, laser drivers, and TIAs for 50-gig through 800-gig port applications including optical transceivers and AOCs. We expect AI deployments to drive a large optical opportunity given the high-density rack-to-rack connections with AOCs or optical transceivers in the back-end RDMA network within a cluster.
I’m pleased that during Q1, we started the ramp-up of our 400-gig optical DSP for a U.S. hyperscaler. Our optical manufacturing partner is delivering 400-gig AOC solutions for an AI deployment at this hyperscaler. We expect the production ramp will continue throughout our fiscal ’24 and into fiscal ’25. We’re also seeing demand restart from data centers in China. While it’s too early to set meaningful expectations, Credo stands to benefit as spending returns in this market. Credo has designs in progress with several optical manufacturers and hyperscalers targeting next-generation 800-gig and 400-gig programs, and we expect ongoing progress in winning production commitments. Beyond the hyperscalers, we see additional optical opportunities with networking OEMs and service providers.
We remain engaged with many partners and prospective customers for Fibre Channel, 5G, OTN, and PON applications. The optical market seems to have turned the corner in the last couple of quarters. We aim to announce and demonstrate new optical solutions at upcoming optical trade shows later this calendar year, and we remain optimistic about our prospects for our optical solutions business. Within our Line Card PHY business, we’re an established market leader with our Line Card PHY solutions for port speeds up to 1.6 terabits per second. We think our overall value proposition becomes even more compelling as the market now accelerates toward 100-gig per lane deployments. During the first quarter, we saw design engagements increasing, specifically with our Screaming Eagle 1.6 terabit per second PHYs. We’ve received strong customer feedback that we have again achieved a leading combination of performance, signal integrity, and power efficiency, and we’ve already had success in winning design commitments from leading networking OEMs and ODMs. We’ve also made significant development progress with our customer-sponsored next-generation 1.6 terabit per second MACsec PHY, which we believe will extend our MACsec leadership well into the future for applications requiring encryption.
Going forward, we expect to remain a leader in this category given our core technology differentiation and deep collaborative relationships with leading networking OEMs and ODMs, as well as hyperscalers directly. Regarding our SerDes IP licensing and SerDes chiplets business, while we see quarter-to-quarter variability in revenue for our SerDes IP licensing business, our customer traction and funnel remain consistently strong. We’ve seen a breadth of wins in this category, including 50-gig and 100-gig lane speeds and process nodes ranging from 5-nanometer to 28-nanometer. End applications include networking, AI, and 5G, as well as a wide range of other applications. In addition to IP, we’ve also developed SerDes chiplet solutions; with two high-profile lead customers reaching production, Credo is beginning to see meaningful revenue in our fiscal ’24.
One of our lead customers is Tesla and, as they’ve publicly presented, Credo is their connectivity partner for their Dojo supercomputer, delivering SerDes IP for their D1 ASIC and SerDes chiplets for off-tile connectivity. We’re receiving increased interest in our SerDes chiplets from additional customers and prospects, which supports industry expectations that chiplets will play an important future role in the highest-performance designs. To sum up, we’re happy with our results in fiscal Q1 and we’re encouraged about demand drivers for the balance of the year and beyond. Credo’s position as a market leader for high-speed connectivity solutions has been years in the making, and the market acceleration toward high-bandwidth solutions at low power with more networking density plays into our strengths.
We continue to expect sequential growth throughout fiscal ’24. We believe our growth will be led by multiple customers across our range of connectivity solutions, which would result in a more diversified revenue base as we exit fiscal ’24. I’ll now hand the call over to our CFO, Dan Fleming, who will provide additional details. Thank you.
Dan Fleming: Thank you, Bill, and good afternoon. I will first review our Q1 results and then discuss our outlook for Q2 of fiscal ’24. As a reminder, the following financials will be discussed on a non-GAAP basis unless otherwise noted. In Q1, we reported revenue of $35.1 million, up 9% sequentially and down 24% year-over-year. Our IP business generated $2.8 million of revenue in Q1, down 51% sequentially and down 73% year-over-year. IP remains a strategic part of our business, but as a reminder, our IP results may vary from quarter to quarter, driven largely by specific deliverables under preexisting contracts. While the mix of IP and product revenue will vary in any given quarter, our revenue mix in Q1 was 8% IP, below our long-term expectation for IP of 10% to 15% of revenue.
We expect IP as a percentage of revenue to come in within our long-term expectations for fiscal ’24. Our product business generated $32.3 million of revenue in Q1, up 22% sequentially and down 10% year-over-year. Our team delivered Q1 gross margin of 59.8%, at the high end of our guidance range and up 160 basis points sequentially due to product mix. Our IP gross margin generally hovers near 100% and was 94.8% in Q1. Our product gross margin was 56.8% in the quarter, up 703 basis points sequentially and up 551 basis points year-over-year, due principally to product mix. Total operating expenses in the first quarter were $27.4 million, within our guidance range, up 1% sequentially and 21% year-over-year. Our year-over-year OpEx increase was a result of a 30% increase in R&D as we continue to invest in the resources to deliver innovative solutions.
Our SG&A was up 8% year-over-year as we built out public company infrastructure. Our operating loss was $6.4 million in Q1 compared to operating income of $5.3 million a year ago. Our operating margin was negative 18.3% in the quarter compared to 11.5% last year due to reduced top-line leverage. We reported a net loss of $4.7 million in Q1 compared to net income of $5.0 million last year. Cash flow from operations in the first quarter was $24.6 million, an increase of $36.8 million year-over-year, due largely to large receivables collected in the quarter. CapEx was $5.3 million in the quarter driven by R&D equipment spending. And free cash flow was positive $19.3 million, an increase of $36.8 million year-over-year. We ended the quarter with cash and equivalents of $237.6 million, an increase of $19.8 million from the previous quarter.
This increase in cash was a result of large receivables collected during the quarter. We remain well capitalized to continue investing in our growth opportunities while maintaining a substantial cash buffer. Our accounts receivable balance decreased 43.5% sequentially to $28.0 million, while days sales outstanding decreased to 73 days, down from 140 days in Q4, due to the collection of several large receivables. Our Q1 ending inventory was $40.8 million, down $5.2 million sequentially. Now turning to our guidance. We currently expect revenue in Q2 of fiscal ’24 to be between $42 million and $44 million, up 23% sequentially at the midpoint. We expect Q2 gross margin to be within a range of 58% to 60%. We expect Q2 operating expenses to be between $27 million and $29 million.
We expect Q2 basic weighted average share count to be approximately 150 million shares. We’re pleased to see FY ’24 continue to play out as expected, and we see some near-term upside to our prior expectations, as the rapid shift to AI workloads has driven new and broad-based customer engagement. We expect that this rapid shift will enable us to diversify our revenue throughout fiscal year ’24 and beyond, as Bill alluded to. However, as new programs at new and existing customers ramp, we remain cautious about the back half of our fiscal year until we gain better visibility into forecasts. In summary, as we move forward through fiscal year ’24, we expect sequential revenue growth, expanding gross margins due to increasing scale and improving product mix, and modest sequential growth in operating expenses.
As a result, we look forward to driving operating leverage and returning to double-digit operating margins by Q4. And with that, I will open it up for questions.
Q&A Session
Operator: Thank you. [Operator Instructions] And our first question comes from Toshiya Hari from Goldman Sachs. Your line is now open.
Toshiya Hari: Hi, good afternoon. Thank you so much for taking the question. Just one question from me. Bill, I was hoping you could talk about the revenue drivers for the current quarter. You’re guiding revenue up 23%, and it’s really nice to see you guys reiterate the sequential growth in the back half as well. You talked extensively about your AEC business, obviously your largest customer and your number two customer, and you seemed to talk up the Tier 2 customers there as well. And then you sounded quite good on optical. So I was hoping you could rank-order the drivers again for the current quarter on a sequential basis, and tell us what you’re most excited about as you go into the second half? Thank you.
Bill Brennan: So I can say that we’re seeing strength really across the board with all of the product lines that we’ve got from AECs to optical to Line Card PHYs, even SerDes chiplets. It’s hard for us to rank order in such a short period of time. We really see things – we see things generally moving in a positive direction. I’ll say that if you look at the year, I think we’ve been pretty clear thus far to say that our AEC business is not going to grow this year, and that’s really due to the fact that our largest customer had a big reduction in the forecast that they have for the current year. With that said, our other AEC business is growing quite rapidly as well as our other product lines. And so, we see significant growth year-over-year if we subtract the number from our largest customer.
So we feel pretty good about the way that demand is shaping up. I think we’ve got growing visibility for this year. And again, I see us really benefiting with the acceleration in lane rates generally. And it’s really positively impacting all of our businesses.
Toshiya Hari: Sorry, one quick follow-up. Thank you for that, Bill. Just the optical DSP business. You talked about the 400 gig opportunity with the U.S. hyperscaler and the production ramp in fiscal ’24 through ’25. You also talked about China coming back a little bit. Again, if you can – I know it’s a relatively small percentage of your business today, but how should we think about the contribution from your DSP business over the next, call it, 12 months to 18 months? Thank you.
Bill Brennan: Right. So I’ve said in the past that one of my benchmarks for this business is when we achieve 10% of our overall revenue, and I have signaled that we don’t expect that to happen this year, but we’ve got all of the activity that would suggest we’ll see that in FY ’25. With that said, I feel good about where we are and the fact that we’re executing to the ramp; we are seeing signs of life in China; we’re qualified at multiple hyperscalers there; and as spending returns in that market, we will see benefit. Now, as far as what Dan has got built in for the year, it’s not a big number for China, it’s a relatively small number. But I think this year we’ll do a good job in approaching that 10% threshold, and then I think we’re on track for next year.
Toshiya Hari: Thank you.
Operator: Thank you. And one moment for our next question. And our next question comes from Quinn Bolton from Needham & Company. Your line is now open.
Quinn Bolton: Hi, Bill. Congratulations on the nice results. I just wanted to see if you could give a little bit more color on what you see going on at the largest customer. Obviously, they had a big forecast change back in February. Wondering if the forecasts have held fairly stable since that time? And I think, if I caught it right in your prepared remarks, you mentioned AECs not only for general compute but also AI at your largest AEC customer. And I’m wondering if you could share some more details there? I thought the AI opportunity with your largest AEC customer was really more of a calendar ’24, maybe even calendar ’25, opportunity, so wondering if things have pulled in on that front?
Bill Brennan: So generally, at our largest customer, we’re still hard at work executing to the backlog and forecast that we’ve got. We’re doing a good job there. The recent information that we’ve got relates to how they’re using our current generation of AECs. We’ve talked about AI deployments also needing front-end networks, and we’ve learned that there are certainly AI deployments at this customer that are using our switch cable as part of that front-end network. So a relatively significant portion of the AECs that we’re shipping today are going into AI applications in addition to general compute. I think it’s no secret that there has been a pretty big shift in their spend that we’ve seen over the last six months.
And as you know, if you look long-term, I think we’re well positioned from the standpoint of both general compute and AI. I’ve talked a lot about the program that we’ve been working on for more than a year, and this is an internally developed program that’s moving straight to 100 gig lanes; they’re going to use AECs for intra-rack connectivity between their appliances and the top-of-rack switches for the back-end network. And so I expect us to participate in both the back-end and front-end networks of those clusters that are deployed. And for next-generation general compute, as they move to 50 gig lane speeds, I think we’re well aligned with this customer on AEC solutions.
Quinn Bolton: And maybe just two quick clarifications there. On that internally developed AI rack, should we still be thinking later calendar ’24 or early calendar ’25 – any update on timing? And can you guys share what percent of revenue the largest customer was in the July quarter?
Bill Brennan: I’ll let Dan answer that.
Dan Fleming: Yes. So from a 10% customer basis, what you’ll see in a couple of days when we file our Q is that we had three 10% customers, the largest of which was 41%, and that largest customer has remained the largest customer over recent history. So obviously, they are still a material contributor to our overall revenue mix.
Bill Brennan: And then as far as your question about timing, the timing on these new programs is really hard to nail down. I can say that it’s very highly prioritized within our customer, and we’re very, very active in executing on that design and qualification. And so I think we’ve signaled that it’s possible for the early part of the ramp to happen within this fiscal year, but it’s really more of a fiscal ’25 ramp overall.
Quinn Bolton: Okay. And then for Dan. Congratulations on the gross margins, especially coming in at the high end of the range with the IP mix being fairly low this quarter. You talked about margins benefiting from product mix, but it sounds like AECs are still a pretty significant contribution to product revenues. Can you give us any more color on what’s driving that margin strength within the product group?
Dan Fleming: Well, it really is. It goes back to the theme that you’ll be picking up on here, which is that a lot of the current trends really are emphasizing the uptick in the need for 50 gig to 100 gig lanes, and we’re seeing that benefit across all of our product lines. If I were to call out a few, even though they might not be 10% of the overall revenue mix, both optical and chiplets contributed materially in the quarter, and Line Card has always been a strong performer for us. So there’s just broad-based favorable product mix that we experienced within Q1. And as you rightly point out, our overall gross margin was 60%, and we achieved that while IP was only 8% of our revenue mix. So we’re quite pleased with the performance of our operations group and the entire organization to achieve that.
Quinn Bolton: Yes. Nice job. Thank you.
Operator: And thank you. And one moment for our next question. And our next question comes from Matt Ramsay from TD Cowen. Your line is now open.
Matt Ramsay: Thank you very much. Good afternoon, guys. I guess for my first question, there are a lot of things I picked up in the commentary over the last 20 minutes that seem pretty positive: three different 10% customers, the guidance being what it was, sort of above consensus, and the expectation to grow for the remainder of the fiscal year. And so I guess I’m wondering if you could detail a little bit more: are you now expecting to grow for the fiscal year of ’24 over ’23? I know you guys had kind of set a ballpark for it being flat when the reset happened in February, and a lot has happened since. And it seems like, with the momentum you’re describing here, you should easily be able to grow this year, but then there was some commentary that you gave, Bill, maybe a little bit more caution in the back half of the year in other parts of the business.
Maybe you could just kind of unpack that a little bit more and if you have a new sort of expectation for the year that would be helpful. Thanks.
Dan Fleming: Yes, let me address that for him. Thanks for the question. So as I mentioned in my prepared remarks, we do remain cautious, as you say, in the back half, because a lot of the shift that’s happened in hyperscale data center spend to AI has given us a lot of opportunity to engage with existing customers and new customers on new programs, and we’re having a lot of success there. But as these new customers and programs ramp, it’s always difficult to pinpoint exactly the impact or how quickly they will ramp. So we remain cautious in the back half as we continue to get better clarity on that back half. If you were to boil that down to kind of simple math, in Q1 we did what we did, and in Q2, if you take the midpoint of the guide, there is a little bit of upside over the expectation that we had set previously. But at this point, we’re not giving further guidance for the latter half of the year.
Matt Ramsay: Got it. No. Thank you. And understand the moving parts. I guess, as my follow-up, I wanted to dig in a little bit more on the optical – exciting to hear about the hyperscale win there. I’d be interested in two things. One, if you guys could spend a little bit more time maybe talking through the competitive advantage that you have in those products, particularly around sort of the N-1 advantage that you have in some other parts of the business – does that apply to optical? And second, if you need to scale that optical business to support these large customer wins, what are the points that you really need to hit and the limitations on scaling, or the big sort of projects ahead of you there, because it seems like that’s something that might need to ramp really quickly. And I’m just kind of trying to check on the capabilities there. Thank you.
Bill Brennan: Sure. So I would say that the N-1 factor that we’ve talked about definitely applies to optical. In this market, we’re the disruptor. There is a strong incumbent position that Marvell has built, and they’ve done a very good job with that business. In reality, the optical industry is somewhat of a commodity business in the sense that this is a very difficult market for our end customers to compete in, and so we play the role of disruptor, and what we have to offer is excellent performance, excellent power, and a disruptive value proposition. And that’s why we’ve been very much welcomed. I feel great about where we are with this generation of products. It’s been a long time coming, but we’ve been able to prove absolutely competitive performance in terms of bit error rates, absolutely competitive power, and to deliver the kind of disruption that we expected with the N-1 process advantage that we’ve got.
And so that’s why we see a lot of activity. The dominoes, I think, will start to fall as, one by one, our customers achieve production with hyperscalers. And from the standpoint of your question about capacity, really no issues. I expect no issues in ramping to any kind of volumes that could be expected. This is a pretty straightforward manufacturing challenge for our team. My team is used to building millions of units, and with our supply partners we’re very much ready from a wafer, substrate, packaging, and test perspective. So I see no issues with our ability to hit the kind of volumes that we could potentially be seeing.
Matt Ramsay: Thanks, Dan. Really appreciate it.
Operator: Thank you. One moment for our next question. And our next question comes from Karl Ackerman from BNP Paribas. Your line is now open.
Karl Ackerman: Yes, thank you, gentlemen. Two questions, if I may. First, I guess, how are you thinking about the competitive landscape for AECs? I ask because our checks suggest that some of your peers have made incremental progress in that market, but at the same time it sort of validates the AEC opportunity as Ethernet captures a growing portion of AI clusters. So your thoughts on that would be super helpful.
Bill Brennan: Sure. I think that, naturally, we’re going to see competition make progress. One of the things that I feel very good about is the way that we’re organized in comparison to the groups of companies that are trying to compete with us. We’re organized in a vertical fashion; I’ve got more than 100 people who come to work every day and work on AEC system solutions. So in a sense, they’re really the customers of our IC development team. And what that enables us to do is be very, very flexible and very quickly respond to special requests from customers. When we first thought about this market more than five years ago, we were thinking that it would be a standard-products type of market where we would put together a cable with a 400 gig connector on each end.
And then it would be kind of a standard IEEE type of spec that we’d be achieving. But the deeper we go with this, I can tell you that all of the high-volume programs that we’ve talked about have special features that have been implemented. And so, with the customer base – we’ve talked at length about the work that we did for Microsoft, enabling the dual-ToR architecture and delivering an intelligent AEC solution. The fact is, others tried to compete but never achieved qualification, we’re the sole source on that program, and that trend has continued. So in every one of our high-volume relationships, every one of our high-volume discussions, the engineers understand there’s an opportunity for innovation, and our team is well organized to achieve those differentiated features that make their rack design that much more valuable.
I think that’s going to be a long-term advantage for us and it’s going to continue to be a competitive moat. As it relates to putting together a 400 gig kind of standard AEC or an 800 gig standard AEC, I think that’s where we’re going to start seeing competition make progress, but again, I think that’s a smaller volume part of the market.
Karl Ackerman: Yes. No, I appreciate that, Bill. For my follow-up, it seems the opportunity in front of you for your discrete optical DSPs is broadening and can address optical – active optical cable solutions, but I’m just curious, what are the margin implications as your discrete 400 gig DSP business ramps over the next couple of quarters? Thank you.
Dan Fleming: Yes, with regard to the guidance we’ve given on how our products lay across the gross margin spectrum, we expect optical in the near and long term to be at the higher end of our product portfolio, not too dissimilar from the standalone optical companies that you’ve seen and tracked in the past. Our Line Card PHY business is kind of right in the sweet spot, as are some of these other emerging product groups like chiplets, and our AEC category is, generally speaking, below that target margin. And just for level setting, there has been no change to our long-term target model. Our long-term gross margin of 63% to 65% is our expectation as these product lines mature and as we execute to our plans, and that’s with IP contributing 10% to 15% of the overall revenue mix.
Karl Ackerman: Thank you.
Operator: And thank you. One moment for our next question. And our next question comes from Suji Desilva from ROTH MKM. Your line is now open.
Suji Desilva: Hi, Bill. Hi, Dan. I wanted to clarify some comments you had in the prepared remarks around, I think you said, Tier 2 customers, where you’re making progress. I just want to understand: are you referring to customers beyond your first two hyperscale customers – three, four, five, and six – or referring to a second tier of cloud customers? And if it is the latter, is it traditional or newer gen-AI type applications?
Bill Brennan: So we’ve mentioned before that we’ve engaged with multiple customers that are using our AECs; these are not hyperscalers, these would be considered Tier 2, and we’ve had success with service providers as well. And we mention that because the product category is, I think, really becoming quite solidified. When a new customer comes in and is looking at our solutions, it becomes almost an automatic decision just based on the fact that you can’t really manage signal integrity or just basic physical size with passive copper, and optical is just a different equation – double the power, double the price – and so it becomes a very easy decision. One of the announcements that we made in the last month or so was the work that we’ve done with Intel on the Habana Gaudi2 cluster that they’ve developed, and that’s a great example of a customer that quickly made the decision to make these intra-rack and rack-to-rack connections, the three-meter type connections, with AECs. And there is a video that they did with us, and it’s on our website for you to look at.
That’s one example of this type of customer. We’ve talked in the past about Comcast; within their infrastructure, they’ve got the need for switch racks, and if you look at the big routers that are part of their infrastructure, it became a very easy decision for them to implement with Credo AECs. These are the types of examples, but we continue to work with what you would consider the hyperscalers, and we continue to make progress. So we’re at different stages of evaluation and development, different stages of engagement, but not at the point where I can report adding that third large customer.
Suji Desilva: Okay. That’s definitely helpful, thanks. And then just to clarify sort of the landscape you’re selling into, where there is Ethernet and there is certainly some InfiniBand out there. I mean, just the notion of whether you guys might at some point serve the InfiniBand market, or whether that wouldn’t be a future opportunity and you expect Ethernet to take share versus InfiniBand. Just some basic color here would definitely help us. Thanks.
Bill Brennan: Yes, I think that NVIDIA has done just a fantastic job. I’m not sure you’ve seen too many back-to-back quarters like what they reported. And so there’s no question that, near term, InfiniBand is seeing great success for the back-end networks within the NVIDIA clusters. With that said, we’re involved in many, many AI next-generation deployments, all of which are Ethernet. And I think if you rely on the market forecasters, they see a very clear coexistence between InfiniBand and Ethernet long term. You can measure it by dollars, you can also measure it by ports, and the expectation, I think, is that the number of Ethernet ports is going to exceed that of InfiniBand sometime in the next few years. So we remain very bullish on Ethernet within AI clusters; there’s no question about the amount of activity.
And there’s no surprise there, in the sense that the market is not going to be satisfied with a sole-source provider. With that said, from the standpoint of our ability to do InfiniBand, there are no limitations. I can say that in the optical space, we work with companies that are delivering both Ethernet and InfiniBand solutions. And as far as an AEC standard would go, it wouldn’t be a huge leap for us to develop those cables if we feel like there’s a market opportunity.
Suji Desilva: Okay. Very helpful, Bill. Thank you.
Operator: And thank you. And one moment for our next question. And our next question comes from Tore Svanberg from Stifel. Your line is now open.
Tore Svanberg: Yes, thank you, and congratulations on the 400 gig DSP win. Bill, on that topic. It’s kind of interesting, right, because the 400 gig market is still in its early days, but we keep hearing from the industry that 800 gig is now ramping. So could you talk a little bit about your design wins and your ramps there? I mean, obviously, now you’ve got this one win in 400 gig that’s going to ramp, but what about 800 gig – when would that start to contribute more meaningfully to revenues?
Bill Brennan: So I think we need to look at the overall opportunity and maybe break it down between front-end and back-end networks. From a front-end network perspective, the slower port speeds are still going to be very popular for a while. We see the market moving to 800 gig for the back-end networks in AI clusters that are moving at 100 gig lane rates, and so the expectation for us on 800 gig is that we probably see that as an FY ’25 type of opportunity.
Tore Svanberg: Great. And I just had a follow-up on the question about the sort of caution for the second half of fiscal ’24. Obviously, there are a lot of irons in the fire here, and I’m just wondering, is that caution kind of more tied to how these ramps go? I’m especially thinking about your second largest AEC customer. I mean, are you concerned that it’s going to grow a lot for the next few quarters and then digest? Any further color you could add there would be great. Thanks.
Bill Brennan: Yes, so I kind of smile thinking about the expectation that we set and talking about cautiousness, because if we look at the quarter-to-quarter growth rates, I think they’re pretty exceptional as we climb back to an equal or larger level than we peaked at prior to the big reset. So beginning the year, I think we had plenty of challenge to maintain that kind of a growth rate, and I feel good about how we look right now. I feel good about the fact that, exiting this year, I expect us to be more diversified, not just on a customer basis, but also on a product line basis. So I kind of smiled because we can take what is an exceptional story, and we’ve got to make sure that the treadmill doesn’t get sped up to the point where expectations are too high. I would just say that the cautiousness is really wrapped in an incredible story of very fast growth quarter-to-quarter.
Tore Svanberg: Great. And then on the second AEC customer ramps. I think you said that’s already in production. Is now sort of the main part of the ramp or does that come a little bit later?
Bill Brennan: It comes a little bit later. These programs go through various stages – qualification, early production, pilot production, and then really a significant ramp. And as we’ve signaled, we’ve been somewhat measured in what we’ve included in this year, and it looks like it’s going to shape up to be a much larger FY ’25 ramp.
Tore Svanberg: Sounds good. Thanks for the color. Really appreciate it.
Bill Brennan: Sure.
Operator: And thank you. And one moment for our next question. And our next question comes from Brett Simpson from Arete Research. Your line is now open.
Brett Simpson: Yes, thanks very much. Bill, I wanted to just ask on SerDes chiplets. Can you maybe just talk a little bit about how you see the landscape here over the next couple of years? I think you called out Tesla as an initial launch customer for SerDes chiplets, and I think they’ve been talking publicly about installing a 100 exaflop cluster over the next sort of five or six quarters. Can you maybe just talk a bit about what specifically Credo is doing within this project, and more broadly just talk about how you see chiplet IO for Credo, particularly now that TSMC is starting to get more and more involved in SoIC and looking at AI customers here? Thank you.
Bill Brennan: Sure. So we’re happy with the relationship that we’ve got with Tesla. We’ve been working with them for multiple years. I’ll start with the fact that they licensed our 100 gig per lane SerDes IP, and in each one of the D1 chips they’ve integrated 576 of those 100 gig lanes. So when we think about total throughput, that’s 57.6 terabits per second for each one of their D1 ASICs. This is why we were so attracted to working on something that was so advanced, and that was even a long time ago. When we talk about 51.2 terabits per second switches, we still talk about that not being expected for some time, and so I think, from a SerDes IP standpoint, this is a great example of the type of work that our team can do in collaborating with a leading-edge design team.
If you look at their tile design, there are 25 of the D1 ASICs in a five-by-five matrix, and surrounding the tile is the off-tile connectivity, with 40 of our chiplets on each one of their tiles. The chiplets are 3.2 terabits per second, also 100 gig per lane on each side; one side is XSR to manage the die-to-die connectivity, and then off-tile is a longer reach. They have been public in talking about ramping, and so that’s where we see meaningful revenue. The second customer that we worked with is Intel, and for their 12.8T switch, which is in production now, we also did a chiplet design for them, also 3.2 terabits per second, with a little bit different interconnect between die. This was what they call BoW, a bunch of wires, on the die-to-die connection, and then off-package was a longer-reach SerDes.
So we feel very good about that, and we’re seeing more customer interest. When we talk about the chiplet business in general, there are lots of different kinds of chiplets. We’re really specializing in connectivity, so SerDes chiplets are our target, and generally we expect it to be a large opportunity. One of the things that we’ve talked about in the past is our effort on PCIe Gen 6. That effort is well underway. We see a big opportunity within servers, whether they’re compute or AI; internally, that network is managed with the PCIe standard, and there is a bandwidth explosion happening inside the box. So we see a large opportunity for PCIe retimers as well as chiplets, and people have talked about the UCIe standard as being the die-to-die interconnect, with off-package being PCIe. So we’re definitely going to be in that market long term.
Brett Simpson: And just a follow-up on this. I mean, a lot of chipmakers are talking about next architectures for AI where they’re physically separating out the IO from their main compute – main accelerator. Can you talk a bit about what that means for Credo? How do you position for some of these next-gen architectures? And are you engaging with any of these sorts of projects at this stage? Thanks.
Bill Brennan: Yes, I think my understanding is that it’s what I just spoke about, this UCIe standard that’s being driven by Intel. We’re part of that group and we’re active; this is defining a standard where you can do chip-to-chip connectivity, and then off-chip you can manage it in different ways. We think the right approach is to go with fast connections off-package, and we’re going to bring the same kinds of advantages to that opportunity that we’ve brought to everything we’ve been involved with, which is faster connectivity with better power efficiency. But that, I think, is what maybe you’re referring to. That’s the way that I perceive it.
Brett Simpson: And, Dan, maybe just a final one. In terms of the guide for next quarter, I wanted to just ask about some of the licensees for your USB for V2. Are you guiding for any royalty revenues in the current quarter from some of these licensees or not? Thanks.
Bill Brennan: No, that’s all kind of beyond the fiscal year – the current fiscal year. We haven’t given guidance on that yet.
Brett Simpson: Okay, thank you.
Operator: Thank you. And one moment for our next question. And our next question comes from Vijay Rakesh from Mizuho. Your line is now open.
Vijay Rakesh: Yes, hi. Thanks, Bill and Dan. Just a question on the AI side. Just wondering – I know you talked about maybe a bigger ramp in ’24, ’25, but you also talked about 5 times the content, getting to 20 servers per rack as you go to AI. Any thoughts on what percent of your cables now go into the AI side? And then, as we look out, is the ramp on AI with Habana only, or do you see opportunities on the AMD MI300, et cetera, as well?
Bill Brennan: I can say that the opportunity is broad for us. Anything that’s Ethernet, I think we see that as a big opportunity. As it relates to your question about overall percentages right now, we’re really in high-volume production with one customer, and we’ve got a second lined up that is at the early stages of ramp. It’s hard for me to really project without detailed forecasts, but my expectation is that both of those customers will eventually buy our solutions for their AI platforms in a significant way. And so I would say, I think general compute will continue to be large for us, but I think that AI will ultimately be where you’ll see the bulk of our cables really at 100 gig lane rates in the near future.
Vijay Rakesh: Got it. And good to see you guys diversifying – three 10% customers now. But I had a question on the optical DSP side. I think most Street models probably had a flat calendar ’24 and a big ramp in ’25 – is that how we should look at it, or, given the level of engagement on the optical DSP side and that things look to be going smoothly there, could that ramp get pulled in?
Bill Brennan: Yes, I think we’re comfortable with what we’ve discussed in the past. I think the activity that we’ve got right now is significant, it’s leading to production commitments. We’re seeing ramps begin. And I would say that I wouldn’t really change the expectation I think that FY ’25 is really where we expect to get the kind of production traction that would allow us to hit that milestone I talked about which is 10% of our overall revenue.
Vijay Rakesh: Got it. Great. Thanks a lot.
Operator: And thank you. And one moment for our next question. And our next question comes from Richard Shannon from Craig-Hallum Capital Group. Your line is now open.
Richard Shannon: Well, great, guys. Thanks for taking my questions. I guess both of my questions are probably on the optical DSP side here. You’ve talked about a kind of early-stage ramp here with a single hyperscaler, and I’ve got kind of two questions here. What sense do you get of your share position there – are you a leader or a fast follower? And then, to what degree are we seeing some follow-on, with either additional designs with that customer or additional hyperscalers ramping there?
Bill Brennan: Yes. For this first hyperscaler, I would say that we’re one of their partners for the optical DSP. It’s hard for me to say what share we’re going to have at that hyperscaler, but I think it will be significant. And with others, I think we’ve got a lot of activity, again really across the board for front-end and back-end networks, and I think we’ll be in a position in the future to give more color on that.
Richard Shannon: Okay, fair enough. I’m going to follow up with another question on the DSP topic as well. I think it was Tore’s question about 800 gig DSPs. I think your response was something along the lines of revenue expected somewhere in fiscal ’25 from that. What does that mean in terms of having better visibility on wins there? Should we hear about that in the next quarter or two, or is it going to be more into calendar ’24 before that happens?
Bill Brennan: Yes, I think we’ll give updates when we’ve got something significant to report. I know that it’s appreciated when we give more color, but I really want to make sure that we’re locked in, and we’ve got many, many shots on goal, so to speak – many opportunities that we’re working on right now.
Richard Shannon: Okay, fair enough. Look forward to those updates. That’s all from me, Bill. Thanks.
Operator: Thank you. There are no further questions at this time. Mr. Brennan, I will turn the call back over to you.
Bill Brennan: So thank you very much for the questions. We really appreciate the participation and we look forward to following up on the call backs. Thank you.
Operator: This concludes today’s conference call. You may now disconnect.