James E. Moylan, Jr.: Rather than give you a number for our growth rate in orders next year, what I would say is this: as a healthy business, which we were for a long, long time prior to the supply chain disruption and COVID and all that sort of stuff, we typically ran orders in a given year at some fraction above our revenue, whether it was 5%, whether it was 10%, some number like that. We're not at those levels right now, which means that we do have to have some catch-up as we move through the next couple of quarters, and possibly some catch-up next year. But that's what we expect our order volumes to be, 1.05 times to 1.1 times revenue for a year, for example, and it varies by quarter.
Gary B. Smith: And that’s obviously a function of the lead time piece as well. So typically, if the world ever gets normalized, it would be slightly ahead of the revenues for the year. Now there may be some bumpiness as we get into that. I mean, for example, in 2022, I think our orders were close to $6 billion, just to give you an order of magnitude around the challenges that we’re having from a backlog point of view.
David Vogt: Got it, helpful Gary. Thanks Jim. Appreciate it.
Operator: Next question comes from Ruben Roy with Stifel. Please go ahead.
Ruben Roy: Yes, thank you for taking my question. Gary, I had a follow-up on some of the commentary around AI and the increasing order rates with cloud. I'm just trying to work through not yet seeing the impacts of traffic growth outside the data center — the traffic that you mentioned being created by the GPUs, etcetera — and yet the order rates are up. So, am I right in assuming that the cloud DCI business is mostly long haul? And if that's the case, can you talk to how you're thinking about the sustainability of those orders around that specific business, cloud DCI?
Gary B. Smith: Yes. I think it’s more of a mix than you think between long haul and metro among these players. We all tend to talk about these hyperscalers as a sort of generic grouping, but if you think about their business models, they’re all very different, be it search, be it cloud, etcetera, Azure-type services. So, therefore, their networks are actually very different as well, including all of the submarine cables and the metro piece and the rest of it. These are now very large, very complicated global networks; it's not just simple point-to-point data center connectivity. They use the 6500 in full configuration with resilience, etcetera, across there. So they are full-blown intelligent networks.
So, to your point, Ruben, the traffic that's obviously flowing right now is data center to data center. The opportunity is when the AI applications come out of the data center to be monetized, they have to go to the WAN. They have to go to consumers in their various forms, be they enterprise or general consumers, and the traffic has to pass across that network. And also, from a model point of view, it needs to talk to the instantiation locally, be it edge compute or whatever the devices are; it needs to maintain connectivity to it. It’s not just that you dump the model down there at the edge of the network and you’re good to go for a month. This stuff has to maintain connectivity. So we’re excited about the opportunity, but that hasn’t yet come out of the data center. They're obviously making massive investments in the compute and in the application and monetization of that.
That investment has not flowed to the network yet.
Scott McFeely: The other dynamic on the AI piece that will have an impact on WAN traffic stems from the massive amounts of compute and the power required to do that. Every one of the cloud providers is talking about the need to further distribute their compute platforms. And that’s going to mean more data centers, more geographical distribution. And guess what — when you do that, you’ve got to network them together. So that’s going to mean more transport, more networking gear. That’s going to be another dynamic as AI starts to have an influence on, I’ll just say, the classic transport part of the network. I’d add, Gary, in terms of where we are today with these folks: yes, it’s their campus/metro DCI, it’s their terrestrial core networks.
It’s their submarine networks, but it’s also across our transport portfolio — it’s line systems, it’s coherent modems, however they want to instantiate it. And to some people’s surprise, it’s our software portfolio and it’s our services portfolio as well. So, it’s quite a broad set of solutions that we’re in that segment with.
Ruben Roy: I really appreciate the detail, guys. That’s really helpful for me. If I can sneak in a quick one for Jim on gross margins. Jim, you're maintaining the mid-40s for the full year, given lower volumes and lower revenue. Can you just — I might have missed this, but did you talk through mix or the linearity of how you think gross margins play out over the next several quarters?