Cloudflare, Inc. (NYSE:NET) Q2 2023 Earnings Call Transcript

Matthew Prince: Yeah. I don’t know how accurate my crystal ball is. But I wouldn’t say that it feels like things are improving; it feels like things are plateauing. Q1 was really hard. The fact that sales cycles increased 20% in one quarter was a very big and frightening occurrence. And I think we were pretty early in earnings season to call out that there was a real concern across IT buyers, but that got reflected by many companies that came after us. What we saw in Q2 was that sales cycles returned to be more in line with what we were seeing last year in 2022. 2022 was still elevated relative to the several years prior to that. So what it feels like to me is that we’re in for a grind, and not Cloudflare in particular, but across the entire economy.

And that grind is going to be hard, but I think it is actually serving us quite well because it’s forcing us toward operational excellence. Across our entire team, people are digging in, they’re working hard, and they’re making sure that every process is as efficient as possible. And I think that you’re right: one of the things that is unique about us versus a number of others is that, as we look at our products, we’ve been able to achieve very high gross margins. I think that’s one of the best indicators that we have a really differentiated platform. And as customers are looking for ways to consolidate their vendors and to get more ROI out of everything they’re doing, they are turning to us, and our team is ready, and we have the right products.

And so I think the grind that’s ahead is going to be hard, but it’s something I’m looking forward to, and we’re going to become a better company as a result of it.

Brent Thill: Thank you.

Operator: And we will take our next question from Keith Weiss with Morgan Stanley. Your line is open.

Keith Weiss: Excellent. Thank you guys for taking the question, and definitely glad to hear things are stabilizing. Matt, I still want to dig into the comment that you made, both in the press release and on the conference call, about Workers and Cloudflare being a really good platform for inference. Can you talk to us about the underlying technical ‘why’, as in why you think that? It’s still pretty early days with these technologies, and we’re trying to figure out how inference plays out over time. So I think your view would be helpful to the overall industry conversation as well as the Cloudflare conversation on why you guys are well positioned. And then the follow-up is for Thomas. Whenever we hear inference, we’re thinking that this is GPU-intensive and compute-intensive work, and typically lower gross margin.

It sounds like you’re pretty comfortable that this isn’t impacting the gross margins in the near term. Is that just because it’s still relatively early days and relatively small volumes, or should we be thinking that, as this ramps, it could potentially have a bigger gross margin impact over time? Thank you.

Matthew Prince: So, Keith, I appreciate both questions. They’re really, really important to understanding the advantages that we have in this space. So first of all, what are the challenges with inference? I think there are really two. One seems like a big deal but is probably actually not as big a deal, and the other doesn’t seem like a big deal, but it is a really big deal. And both, I think, shape how we think the inference market is going to work out. The one that kind of feels like it’s a bigger deal is performance: if you’re playing with the various generative AI companies and trying to do something, that wait time between when you submit a query and when you get back a response is going to become a bigger and bigger differentiator between different AI platforms.

And so anything that you can do to make that performance as fast as possible is advantageous. And one of the ways to do that is to move the actual inference as close as possible to the person who’s requesting it. And so, again, we think that inference will primarily be done on device or very close to where the end user is, inside the network. Again, if the ball is bouncing across the street, you want that inference to be done on the device, on the driverless car itself. So we won’t win every inference task. But there will be a lot that makes sense to be running in the network, where we have, again, almost infinite network capacity, almost infinite storage and memory, and very, very significant CPU and GPU resources to be able to run those inference tasks.

That, I actually think, will be the lesser of the two advantages for us. The larger one, which again doesn’t feel like a bigger deal but which we’re already seeing play out in some of the regulatory efforts happening around the world, is that a lot of the time the data for these inference tasks is very private. People and governments want that data to stay as close to the actual end user as possible. So we’ve already seen action in Italy that has restricted the use of certain AI tools because they send data out of the country. What Cloudflare can uniquely do, because we’re positioned across more than 250 cities worldwide and are in the vast majority of countries, is actually process that information locally. So again, we think that on the device, or very close to where the user is on Cloudflare’s network, is going to be the place where inference is going to take place.

I’ll take a quick stab at your second question as well and then hand it off to Thomas for anything that he would add. [Technical Difficulty]