And increasingly, I think as you’re seeing with the executive order on AI and with some of the European regulation on AI, being able to keep AI local is, we think, going to be a real differentiator for us. The vector database, I think, is actually a good question to ask about; it got lost in some of the other stories, but the people who are paying attention within the AI space recognize how important it is. Being able to fine-tune your models and have a database that’s built on top of the existing R2 infrastructure we have is something that not only allows us to do inference, but actually allows us to do fine-tuning as well, which gives us two of the three major legs of the AI stool.
And that’s sort of my sneaky feature that I think is going to be pretty disruptive, because you can use our vector database whether or not you’re using the rest of Cloudflare’s AI systems, which makes it a really great tool for AI users who want to do fine-tuning. That, combined with the locality we can deliver with the Workers AI system and inference scattered around the entire world, allows us to offer something that is truly a complete AI ecosystem. And again, the AI developers who are paying attention asked the same question, which is, wow, how did you add a vector database? And the good news is, again, all of these things are built on a lot of the primitives that we had before. We didn’t have to go out and build something new. We can put GPUs in our existing servers.
We could build Vectorize, the vector database, on top of R2 and some of the other primitives that we had out there. And we could learn from the huge number of AI startups that are already using Cloudflare in terms of what tools they needed in their tool kit, and that’s what our team is delivering.
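[Editor’s note: the core operation of a vector database like the one described above is nearest-neighbor search over embeddings. The sketch below is a minimal, illustrative version of that idea only; it is not Cloudflare’s implementation, and the document IDs and vectors are made up.]

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def nearest(query, index, top_k=1):
    """Return the top_k (doc_id, score) pairs most similar to the query vector."""
    scored = [(doc_id, cosine_similarity(query, vec)) for doc_id, vec in index.items()]
    return sorted(scored, key=lambda s: s[1], reverse=True)[:top_k]

# Toy index: document IDs mapped to 3-dimensional embeddings.
index = {
    "doc-a": [1.0, 0.0, 0.0],
    "doc-b": [0.0, 1.0, 0.0],
    "doc-c": [0.9, 0.1, 0.0],
}

print(nearest([1.0, 0.05, 0.0], index, top_k=2))
```

A real vector database adds approximate-nearest-neighbor indexing so this lookup stays fast at millions of vectors, but the query semantics are the same.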
Andrew Nowinski: That’s great. Thanks, Matt. Keep up the good work.
Operator: Your next question comes from the line of Hamza Fodderwala from Morgan Stanley. Please go ahead.
Hamza Fodderwala: Hi. Good evening. Thanks for taking my question, and congrats on the solid result in what’s been a pretty tough environment. Matthew, you talked a lot about the AI inference opportunity, and a lot of great color there. Could you maybe level set and remind us of all the different vectors for potential monetization over time? You talked about R2, and potentially the vector database angle as well. But are there any others we should consider? And then maybe a follow-up for Thomas. I believe a lot of this is priced on more of a consumption basis. So as the demand starts to ramp, should we start to see that in more of a real-time fashion as it relates to your revenue? Thank you.
Matthew Prince: Yes, I’ll start and then Thomas can add to it. I think there are three different areas in which we can see growth and delivery from AI. The first, where we’ve seen it now for at least the last 18 months, is just in our traditional products: using Cloudflare’s security services to protect AI systems is absolutely critical. And if you go to some of the leading AI platforms out there, you’ll often see Cloudflare’s logo, where we’re actually using AI systems ourselves to check to make sure you’re a human being, and to make sure that you’re not a threat before letting you on. So that, I think, is just our bread and butter, and what we can deliver very efficiently. The second area is with things like R2 and charging for storage.
And again, that’s going to be storing the models, storing the training sets for those models, and using the fine-tuning data with R2 and Vectorize to be able to process those models. And again, that’s going to be much more of a, as you said, consumption-based approach. And then the third way is that we’re charging for inference. And what I think is unique about us is that, because at core Cloudflare is incredibly good as a routing and scheduling engine, we’re able to deliver the very high gross margins that we have compared with some others in the space; we just get a much higher degree of utilization, and we pass that on to our customers. And in this case, the way that we’re charging for our GPUs is what the industry terms a serverless method of charging.
And what that means is we only charge you when you’re actually running an inference task. And then we’re able to schedule that very effectively across our entire platform. We think that’s going to be as disruptive in this space as some of the things we’ve done with Workers have been in the traditional space. And that’s something that is very attractive to AI developers. So those are the three ways we see monetization around this. One is our traditional security products. The second is around storage of either training sets, the models themselves, or the fine-tuning data. And the third is actually charging for what is effectively compute capacity, and doing that in a way that is, again, very disruptive compared with some of the other providers in the space.
And we can often decrease people’s inference task costs pretty substantially, while it’s still a very high-margin business for us. So we’re excited.
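[Editor’s note: the serverless billing model described above can be sketched as simple metering: the customer pays for the seconds of inference actually executed, rather than for a reserved block of GPU time. The rates and workload below are hypothetical, for illustration only; they are not Cloudflare’s actual prices.]

```python
# Hypothetical rates, for illustration only.
PRICE_PER_GPU_SECOND = 0.0005   # serverless: billed per second of inference run
PRICE_PER_GPU_HOUR = 1.80       # provisioned: billed per hour reserved, idle or not

def serverless_cost(request_durations_s, rate=PRICE_PER_GPU_SECOND):
    """Charge only for the inference time actually executed."""
    return sum(request_durations_s) * rate

def provisioned_cost(hours_reserved, rate_per_hour=PRICE_PER_GPU_HOUR):
    """Charge for the whole reservation window, whether or not it is used."""
    return hours_reserved * rate_per_hour

# 10,000 inference calls averaging 120 ms each = 1,200 GPU-seconds of real work.
durations = [0.12] * 10_000
print(f"serverless:  ${serverless_cost(durations):.2f}")   # pay for 1,200 s of work
print(f"provisioned: ${provisioned_cost(24):.2f}")         # pay for a 24 h reservation
```

The gap between the two numbers is the utilization argument Matthew makes: a scheduler that packs many customers’ bursty requests onto shared GPUs can bill each customer only for their execution time and still run the hardware hot.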
Thomas Seifert: Coming back to your second question. Today, the share of variable revenue in our overall revenue is very, very low. But the ramp of the AI services and products that Matt just mentioned would increase this share. We’ve already seen some of the strength in the third quarter, from a revenue perspective, coming from variable revenue. That’s one data point; it’s not enough to establish a correlation or a trend. But with a higher share of products and services that are priced on a variable basis, you would see a more immediate impact on revenue, for sure. We don’t have enough data yet to see how this will play out, but the first signs are encouraging.
Hamza Fodderwala: Thank you.
Operator: Your next question comes from the line of James Fish from Piper Sandler. Please go ahead.
James Fish: Hi guys. Thanks for the question. You guys have talked a lot about AI here, but where are we with getting more shots on goal with more of the wave 2 products in network security? And additionally, Thomas, more for you: net new customers were good, but the dollars added were a little lower than what we’ve seen in the past few quarters. Is that just a reduction in contract durations given the macro, or what other aspects are impacting this? And I’m sorry if I missed this, did you give an RPO number this quarter?
Matthew Prince: Yes, Jim, so I’ll take the first bit and then hand it off to Thomas for the second bit. I think we’re seeing real strength around network security and our Zero Trust products. We’ve been recognized as leaders in those spaces by a number of the key analysts, and that’s driven up the amount of interest. The pipeline for those products is extremely strong. And what we’re seeing is that, increasingly, especially in an environment of making every IT dollar go further, customers want to say: I don’t just want to protect the back door of my business, I want to protect the front door, the back door, the side door, and all of the doors in the business. And so we’re the one vendor that is able to give people that vendor consolidation, that single pane of glass.
And I think that comes through in a lot of the customer examples and stories that we’ve seen. So what we’re seeing more and more is that people want to buy the entire Cloudflare platform. They want to protect their entire business with it, and that’s driving more interest in both our network security and our Zero Trust products.
Thomas Seifert: RPO for the third quarter was $1.83 billion; I think it was part of my script. Expansion is getting better; DNR ticked up 1 percentage point, so it’s stabilizing. I think that is what we talked about in the previous earnings call, that we saw it bottoming out. But I would still say that it is easier to do new logo acquisition than it is to expand with existing customers. And the trend we’ve seen, that this might be impacted, timing-wise or budget-wise, by current macro concerns, I think still holds true. It has not changed materially from the second quarter.
James Fish: Awesome. Thanks guys.
Operator: Your next question comes from the line of Shrenik Kothari from Baird. Please go ahead.
Shrenik Kothari: Yes, thanks for taking my question, and congrats on the solid execution. I’d like to switch gears a little bit to DDoS. Matthew, Cloudflare’s unique approach to DDoS pricing definitely differs from the competition: instead of tying the price to the size of the attack, you’ve opted for a more customer-centric approach. So I’m curious, in today’s elevated DDoS landscape, is that flexibility of not being charged based on the scale of the attack being appreciated by customers? Is it becoming a key driver of stronger share gains? And then I have a quick follow-up for Thomas.