And GenAI is front and center, and we are doing boot camps. We are doing GenAI Workshops in a Box with our customers. These have been wildly successful. So I’m very excited about the early momentum. Like you said, it’s going to take some time for this to build up. So we’re still in the early innings. But over the long haul, I believe this is going to be a very material increase in TAM for us.
Brent Thill: Thank you.
Operator: The next question comes from Andrew Sherman of TD Cowen. Please go ahead.
Andrew Sherman: Oh, great. Thanks. Hey, guys. Janesh, the annual cloud number seemed pretty strong — stable at mid-40s growth. So there’s really just that SMB pressure that’s kind of dragging down the overall number? And what are you assuming in that segment going forward?
Janesh Moorjani: Yes. I mean, I think the math that you’re doing is pretty straightforward. And you’re right. Fundamentally, we’ve been very pleased with the annual cloud selling motion, which is a sales-led motion, and with both the commitments that we’ve secured, as Ash has been talking about, and the consumption that we’ve driven against those commitments. And if I think about the SMB portion, as I mentioned a short while ago, that remains soft. It was consistent with where it’s been before. In terms of thinking about the future, we won’t unpack the segment-based view on cloud here, but fundamentally, if I think about our cloud business overall, we continue to be really excited about the opportunity set there and expect that it will continue to grow faster than the rest of the business.
Andrew Sherman: And then, Ash, are you seeing anything yet in the pipeline, potentially more customers reaching out to you because of potential disruption at your largest logging competitor?
Ash Kulkarni: Thanks for the question, Andrew. Look, there’s no specific disruption or change that I’d call out. What I will say is that the whole motion of getting customers and prospects to consolidate onto our platform by displacing incumbents is something we’ve been driving for a while now. We see this as a tremendous opportunity. We have so many strengths when it comes to log analytics and SIEM and search, and we almost always land with those, and once we are in there, that gives us the opportunity to expand even further beyond that. Everything that’s going on in the market, but also our innovation, the ability for customers to now use capabilities like ES|QL, the Frozen Tier we’ve talked to you about in the past, which is tremendously beneficial for customers, and our AI capabilities around our Observability and Security AI Assistants, all of those are big factors in why customers are now very comfortable and confident about moving over to our platform, and that’s really exciting.
Andrew Sherman: Great, thanks guys.
Operator: The next question comes from Koji Akida of Bank of America. Please go ahead.
Koji Akida: Hey guys, thanks for taking the questions. When we think about the business in terms of enterprise search, security, and observability, you’ve talked in the past about ways to think about the growth rates between those three categories. So I’m wondering if you could get a little more granular here and let us know how you’re thinking about growth rates across these three categories over the medium term, to frame your medium-term growth aspirations?
Janesh Moorjani: Koji, maybe I’ll take a stab at this and then invite Ash to add more. If I think about the mix across the solution areas, overall, Q3 was consistent with what we’ve generally seen in the last year in terms of ACV. It bounces around a little bit based on deal flow, renewal contracts, timing, and so forth. If I think about the medium term, all three solution areas are growing nicely for us. Any mix shift, if it occurs, will take time, and we’re not actively looking to change the mix. As I think out over the future, search should benefit from GenAI, which, as we’ve shared before, we view as an expansion of the TAM. And so that presents a significant opportunity for us. But we also see a pretty significant opportunity to drive growth in security and observability with the consolidation motion that we’ve been seeing as well as with the strength of the GenAI capabilities that we build into the stack.
Those capabilities actually help both security and observability as well, for example, the AI Assistants that we’ve launched and have talked about before. So that’s the way I think about it for the medium term. Ash, anything you’d add there?
Ash Kulkarni: No, I think you nailed it, Janesh. Nothing more to add.
Koji Akida: Great, guys, thanks so much. Thank you.
Janesh Moorjani: Thanks, Koji.
Operator: The next question comes from Brad Reback of Stifel. Please go ahead.
Brad Reback: Thanks very much. Janesh, as part of these investments that you’re talking about for next year and beyond, do you need to make COGS investments to support all this AI functionality?
Janesh Moorjani: Brad, as I think about our investment profile for the future, we will see a natural increase in COGS. I think that’s more a function of the cloud growth that we see. We’re not expecting to see anything significantly different structurally. That said, as we launch our serverless offering over the coming months, there may be some initial investments related to that, but we will factor that into the model when we provide guidance more formally for fiscal ’25. There’s nothing major I would call out at this stage.
Brad Reback: Perfect. Thanks very much.
Ash Kulkarni: Janesh, if I might just quickly add to that. One of the things that I want to make sure everybody understands is that we don’t need things like GPUs to be able to deliver the RAG functionality. We run our capabilities very, very efficiently on CPUs, and we have instance types in the cloud that are optimized for that; they are just regular, memory-optimized instance types. And so from a COGS perspective, one of the reasons why customers really love our offering in the area of GenAI is the overall efficiency. I just wanted to add that because, if there are any questions around GPU usage and so on, I want to make sure there is some clarity around it.
Brad Reback: Great, thank you.
Operator: The next question comes from Mike Cikos of Needham & Co. Please go ahead.