Salesforce, Inc. (NYSE:CRM) Q1 2024 Earnings Call Transcript May 31, 2023
Salesforce, Inc. misses on earnings expectations. Reported EPS was $0.41, expectations were $1.61.
Operator: Welcome to Salesforce Fiscal 2024 First Quarter Results Conference Call. All lines have been placed on mute to prevent any background noise. [Operator Instructions] I would like to hand over the conference to your speaker, Mike Spencer, Executive Vice President of Investor Relations. Sir, you may begin.
Mike Spencer: Good afternoon and thanks for joining us today on our fiscal 2024 first quarter results conference call. Our press release, SEC filings, and a replay of today’s call can be found on our website. With me on the call today are Marc Benioff, Chair and CEO; Amy Weaver, President and Chief Financial Officer; and Brian Millham, President and Chief Operating Officer. As a reminder, our commentary today will include non-GAAP measures. Reconciliations between our GAAP and non-GAAP results and guidance can be found in our earnings press release. Some of our comments today may contain forward-looking statements and are subject to risks, uncertainties, and assumptions, which could change. Should any of these risks materialize or should our assumptions prove to be incorrect, actual company results could differ materially from these forward-looking statements.
A description of these risks, uncertainties, and assumptions and other factors that could affect our financial results is included in our SEC filings, including our most recent report on Forms 10-K, 10-Q, and any other SEC filings. Except as required by law, we do not undertake any responsibility to update these forward-looking statements. And with that, let me hand the call to Marc.
Marc Benioff: Thanks, Mike, and thank you all for being on the call. On our last call in March, we told you about how Salesforce had radically accelerated our transformation to profitable growth. We shared with you how we hit the hyperspace button across the key areas of our transformation: restructuring for the short and long term, reigniting our performance culture by focusing on productivity, operational excellence, and profitability, prioritizing our core innovations that drive customer success, and building even stronger relationships with you, our investors. Our Q1 results show that we continue to make great progress. As I said in March, we’re just getting started with this incredible transformation. We continue to scrutinize every dollar of investment, every resource, and every spend, and we’re transforming every corner of our company.
Our progress over the last five months has been very impressive, and I could not be more grateful to our entire team for their leadership. In fact, you may hear me say that several times on this call. Our transformation drove our Q1 financial results. As I said on our last call, improving profitability is our highest priority. As a result, we significantly exceeded our margin target for the quarter, delivering a non-GAAP operating margin of 27.6%, up 1,000 basis points year-over-year, incredible. And there’s no greater point of evidence of our transformation than this amazing result, following the tremendous operating margin in Q4. In Q1, we delivered $8.2 billion in revenue, up 11% year-over-year and 13% in constant currency. We had some amazing wins in the quarter with Northwell Health, Paramount, Siemens, Spotify, NASA, and the U.S. Department of Agriculture, among others.
We delivered $4.5 billion in operating cash flow, up 22% year-over-year. Our remaining performance obligation ended the quarter at $46.7 billion, an increase of 11% year-over-year. And through Q1, we’ve now returned more than $6 billion in share repurchases. As a result, for the third quarter in a row, we ended the quarter with fewer shares year-over-year, another amazing point of evidence of this incredible transformation. Now, turning to our financial guidance. While the economy is not in our control, our margins are, which is why we’re raising our margin target for the full fiscal year. For FY 2024, we’re raising our non-GAAP operating margin guidance to 28%, an improvement of 550 basis points year-over-year, and we remain confident that we’ll hit 30% non-GAAP operating margins in the first quarter of fiscal year 2025.
We could not be more excited about our progress. We’re maintaining our fiscal year 2024 revenue guidance of approximately $34.5 billion to $34.7 billion, over 10% projected growth year-over-year. I couldn’t be more proud of how our team has come together, stepped up, and delivered these results. I’ve also been asked numerous times this quarter by our investors and our customers how we’re able to make so much progress so fast and deliver these incredible numbers. It’s very simple. It’s our Ohana culture. It’s our superpower. And again, I’d like to thank our amazing team for this incredible accomplishment. Last quarter, I told you how our AI team was getting ready to launch Einstein GPT, the world’s first generative AI for CRM. At TrailheadDX in March, in front of thousands of trailblazers here in San Francisco, that’s exactly what we did.
At its foundation, Einstein GPT is open and extensible. Customers can connect to multiple large language models, including from partners like OpenAI and Anthropic and others. This is a whole new way to work for our customers, users, and trailblazers. Users on Salesforce are seeing new generative AI features across all of their most common workflows. And while many of these will be created by Salesforce developers, far more will be created by our incredible trailblazer ecosystem. For low-code trailblazers, Einstein GPT will provide a toolset to design generative AI apps built on reusable prompts. For pro-code trailblazers, Einstein GPT will offer an extensible ecosystem of LLM providers with configurable grounding. And Einstein GPT is the combination of tremendous research and engineering by our world-class AI team, and I’d like to congratulate them on this amazing result.
And one more amazing result: this week, Einstein, Salesforce Einstein that we’ve been talking about for so many years on these calls, will generate an incredible 1 trillion predictions for our customers, an incredible milestone on our AI journey. We saw more of the incredible work of our AI team at our New York City World Tour this month when we demonstrated Slack GPT. Slack is a secure treasure trove of company data that generative AI can use to give every company and every employee their own powerful AI assistant, helping every employee be more productive and transforming the future of work. Slack GPT can leverage the power of generative AI to deliver instant conversation summaries, research tools, and writing assistance directly in Slack, and you may never need to leave Slack to get a question answered.
Slack is the perfect conversational interface for working with LLMs, which is why so many AI companies are Slack-first and why OpenAI’s ChatGPT and Anthropic’s Claude can now use Slack as a native interface. Slack is also delivering integrated sales and service experiences powered by native GPT to be the best interface for all of our Salesforce customers, and there’s a lot more magic to come with Slack and generative AI. This month, we also announced Tableau GPT at our Tableau Conference, where we had over 8,000 in-person attendees. Tableau GPT simplifies data analysis for all of our users, enabling anyone to inquire about their data using Einstein GPT and obtain AI-driven insights at scale. The intelligence and automation that Tableau GPT provides is tremendously important in this era of hyperscale data that we’re all entering.
The coming wave of generative AI will be more revolutionary than any technology innovation that’s come before in our lifetime, or maybe any lifetime. Like Netscape Navigator, which opened the door to a greater Internet, a new door has opened with generative AI, and it is reshaping our world in ways that we’ve never imagined. Every CEO realizes they’re going to have to invest in AI aggressively to remain competitive, and Salesforce is going to be their trusted partner to help them do just that. Every CEO I’ve spoken with sees AI as a revolution beginning and ending with the customer, and every CIO I’ve spoken with wants more productivity, more automation, and more intelligence through using AI. A great example [of deploying] [ph] this technology is Gucci.
We’re working with them to augment their client advisors by building AI chat technology that creates a Guccified [indiscernible] service, an incredible new voice, amplifying brand storytelling and incremental sales as well. It’s an incredibly exciting vision for generative AI to transform what was customer service into customer service, marketing, and sales, all through augmenting Gucci employee capabilities using this amazing generative AI. But we can only do all of this with trust. Our customers need to understand where their data is going, and they must be able to maintain data integrity and access and privacy controls. Large customers must maintain data compliance as a critical part of their governance while using generative AI and LLMs. This is not true in the consumer environment, but it is true for our customers, our enterprise customers, who demand the highest levels of this capability.
Our customers, who for years have used relational databases as the secure mechanism for their trusted data, already have that high level of security down to the row and cell level. We all understand that. And that is why we have built our GPT trust layer into Einstein GPT. The GPT trust layer gives connected LLMs secure, real-time access to data without the need to move all of your data into the LLM itself. It’s an incredible breakthrough for our customers in working with LLMs in a secure and trusted way. While they’re using the LLMs, the data itself is not moving into and being stored in the LLM. That is what our customers want. They can be sure that their customer data is where they know it is, where they can be assured that it is for their compliance and for their governance.
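To make the pattern Marc describes concrete, here is a minimal, hypothetical sketch of query-time grounding: the CRM remains the system of record, records are fetched under the caller’s existing row- and cell-level permissions, and they are passed to a connected LLM only as ephemeral prompt context. All class, function, and parameter names below are illustrative assumptions, not Salesforce APIs.

```python
# Illustrative sketch only, not Salesforce's actual implementation.
# CRM data stays in its system of record, is fetched at query time under the
# caller's existing permissions, and is sent to the LLM only as ephemeral
# prompt context; nothing is stored inside the model.

from dataclasses import dataclass


@dataclass
class User:
    user_id: str
    tenant_id: str


class CRMStore:
    """Stands in for the system of record with row/cell-level security."""

    def fetch_visible_records(self, user: User, query: str) -> list[dict]:
        # Enforce the same row- and cell-level permissions the user already
        # has in the CRM; the LLM never sees data this user cannot see.
        raise NotImplementedError


class LLMClient:
    """Stands in for any connected model provider (zero-retention call)."""

    def complete(self, prompt: str) -> str:
        raise NotImplementedError


def grounded_answer(user: User, question: str,
                    crm: CRMStore, llm: LLMClient) -> str:
    records = crm.fetch_visible_records(user, question)  # data stays put
    context = "\n".join(str(r) for r in records)          # ephemeral context
    prompt = (
        "Answer using only the context below.\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    return llm.complete(prompt)  # the prompt is not retained by the provider
```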
And I cannot be more excited about our AI CRM and delivering on this future of trusted AI through our new Salesforce GPT trust layer. Finally, I can’t talk about AI without talking about the success of our Data Cloud. Data Cloud is the heart of Customer 360 and now our fastest growing cloud ever. Data Cloud creates a real-time intelligent data lake that brings together and harmonizes all of our customers’ data in one place. In Q1, we closed one of our largest healthcare industry deals ever with Northwell Health, New York’s largest private employer. They have 21 hospitals, 900 outpatient or ambulatory facilities, and their own medical school, all in New York. By integrating Data Cloud with Health Cloud, Tableau, MuleSoft, and our entire Customer 360, Northwell is improving patient care by bringing together its vast data resources to create a single source of truth and using AI to govern data use and maintain regulatory compliance.
This is the future of our customers and our industry. It’s AI plus data plus CRM. And of course, this AI revolution is just getting started, which is why we’ve invested $250 million in our new AI venture fund to fuel startups developing our trusted generative AI vision. We’ll be talking more about this at our AI Day event on June 12th in New York City, and I hope that you’ll join me there. To wrap up, we’re transforming every corner of our company. We’re laser focused on our short-term and long-term restructuring, improving productivity and performance, prioritizing our core innovations, and delivering for our shareholders. As a result, productivity is up, profitability is up, revenue is up, cash flow is up, and we dramatically increased our margin guidance.
And just like cloud, mobile, and social, this AI revolution is a new innovation cycle. It’s going to be a new spending cycle as well, which is going to spark a massive new tech buying cycle. And we’ve led the industry through each of these cycles, and I couldn’t be more excited for our future as we continue on a path to our long-term goal to make Salesforce the largest, most profitable enterprise software company in the world, and the number one, safest, and most trusted AI CRM. With that, Brian, I’ll turn it over to you.
Brian Millham: Thanks, Marc. As Marc said, we’re continuing our transformation across every part of our company. Our focus on performance culture and operational excellence contributed to our strong first quarter results. Since our last call, we’ve removed layers to get closer to our customers and taken complexity out of our business to help us accelerate through the rest of the year. We clearly defined our return-to-office and remote work guidelines for our employees, and it’s been great to get together even more in our offices and with our customers around the globe. I had the chance to visit many of our offices this quarter, and the energy is incredible. As you heard from Marc, our transformation plan continues to deliver top and bottom line growth as we help our customers increase productivity, drive efficiency, and become AI-first companies.
But we’re still operating in an uncertain macro environment. Customers continue to scrutinize every deal, and we see elongated deal cycles and deal compression, particularly in our more transactional revenue streams like SMB, create and close, and self-serve. Also in Q1, our professional services business started to see less demand for multi-year transformations, and in some cases delayed projects, as customers focused on quick wins and fast time to value. For this reason, we saw strong performance from some of our fast-time-to-value, efficiency-focused products, with sales performance management, sales productivity, and digital service all growing annual recurring revenue above 40% in the quarter. As customers look to reduce complexity and achieve faster time to value, they’re expanding their adoption of Salesforce clouds, a key growth strategy for us.
The world’s most recognized companies are relying on Salesforce: more than 90% of the Fortune 100 use Salesforce, and they average more than five of our clouds. This is why we’re so excited about our AI plus data plus CRM strategy. As Marc explained, we’re building Einstein GPT and Data Cloud into every cloud and our Customer 360, and we’re perfectly positioned to help our customers harness the phenomenal power of AI. Our core offerings remain resilient. In Q1, 9 of our top 10 deals included sales, service, and platform. Industry clouds continue to be a tailwind to our growth, and we saw momentum with great customers like Northwell, USDA Rural Development, and NASA, who we showcased at World Tour DC in April. Once again, eight of our industry clouds grew ARR above 50%.
I met with hundreds of customers in the quarter, and we hosted 700 meetings in our innovation centers with our top customers and prospects. Generative AI is top of mind for all of them as they look to benefit from the intelligence, automation, and cost savings that Salesforce is uniquely positioned to deliver. We’re seeing tremendous appetite for our new generative AI products, starting with Einstein GPT, Slack GPT, and Data Cloud. Our generative AI products will be a catalyst for our future growth. As Marc mentioned, Data Cloud continues to be one of our fastest growing products, and we had great wins in the quarter with companies like Major League Soccer and Giorgio Armani. Armani uses Data Cloud to deliver hyper-personalized online and in-store experiences, real-time engagement, and curated shopping recommendations.
We can see how Data Cloud and Einstein GPT are going to create experiences that weren’t possible before and really drive growth. In an environment where customers are optimizing their current tech stacks, integration and automation continue to be efficiency drivers. MuleSoft again delivered strong results with wins at Siemens, [Cinnova] [ph], and Vodafone. For the first time, Salesforce was ranked number one in integration by market share in the latest IDC software tracker, a great testament to our MuleSoft team. Tableau is unleashing the power of our Data Cloud, unlocking customer data and delivering actionable real-time insights. In the quarter, we had great wins at customers like Union Bank of the Philippines, Discover Financial Services, Moderna, ADT Solar, and Alaska Air.
We’ve made great investments to reaccelerate Tableau, including new leadership along with product innovations like Tableau GPT and revenue intelligence, now one of our fastest growing add-ons. I’m really encouraged by the Slack team, who has created an ambitious product roadmap with generative AI at the center. In Q1, we saw amazing momentum with customers like the California Office of Systems Integration, Paramount Global, Breville, and OpenAI, and rolled out an AI-ready platform, Slack Canvas, and app integrations with ChatGPT and Anthropic’s Claude. Overall, I could not be more thrilled with our offerings and our market position, especially as it relates to delivering on the promise of AI. We’re looking forward to continuing the energy and momentum at our AI Day in just a couple of weeks.
I’m very proud of the teams and of our partners. Their focus on customer success continues to be outstanding. As Marc said, our productivity is up, profitability is up, revenue is up, cash flow is up. We’re increasing our margin guidance, and Salesforce is leading the way as the number one AI CRM. Now, over to you, Amy.
Amy Weaver: Thank you, Brian. As Marc said, a key part of our transformation to profitable growth is short and long-term restructuring of the company. We have now largely completed the restructuring announced in January, and we’re completing our comprehensive operating and go-to-market review. As we shift to the implementation phase, we’re executing against three key pillars: optimization of resources and organization structure, product investment prioritization, and operational rigor. We continue to view sales and marketing and G&A as the primary drivers of leverage, while R&D remains an important investment area. Our profitable growth framework, disciplined capital allocation strategy, and opportunity to drive shareholder value are represented in our actions and in our results.
Now, turning to our results for Q1 of fiscal year 2024, beginning with top line commentary. For the first quarter, revenue was $8.2 billion, up 11% year-over-year or 13% in constant currency, with the beat primarily driven by strong momentum in MuleSoft and more resilient core performance. Geographically, we saw strong new business growth in parts of EMEA and LatAm, specifically Switzerland, Italy, and Brazil, while we experienced continued pressure in the United States. In Q1, the Americas revenue grew 10%, EMEA grew 12% or 17% in constant currency, and APAC grew 16% or 24% in constant currency. From an industry perspective, manufacturing, automotive, and energy all performed well, while high-tech and financial services remained under pressure.
Q1 revenue attrition ended the quarter at approximately 8%. As expected, we saw a modest increase in Q1, partially attributed to the inclusion of Tableau in the metric. We also noted some incremental weakness in our marketing and commerce attrition. As Marc said, non-GAAP operating margin finished strong in Q1 at 27.6%, driven by our disciplined investment strategy and accelerating our restructuring efforts. Q1 operating cash flow was $4.5 billion, up 22% year-over-year. This includes a 910 basis point headwind from restructuring. Q1 free cash flow was $4.2 billion, up 21% year-over-year. Turning to remaining performance obligation, or RPO, which represents all future revenue under contract. This ended Q1 at $46.7 billion, up 11% year-over-year.
Current remaining performance obligation, or CRPO, ended at $24.1 billion, up 12% year-over-year in both nominal and constant currency, ahead of expectations, driven by strong core performance, partially offset by continued create and close softness. And finally, we continued to deliver on our capital return commitment. In Q1, we returned $2.1 billion in the form of share repurchases, bringing the total returned to more than $6 billion since the program was initiated last August, representing more than 38 million shares. Before moving to guidance, I wanted to briefly touch on the current macro environment that Brian discussed. The more measured buying behavior persisted in Q1. And as Brian noted, in Q1, we started to see weakness in our professional services business.
We expect these factors to persist, which is incorporated in our guidance. Let’s start with fiscal year 2024. On revenue, we are holding our guidance of $34.5 billion to $34.7 billion, representing over 10% growth year-over-year in both nominal and constant currency. The strength in our Q1 performance is offset by the pressure in our professional services business previously discussed. For fiscal year 2024, we are raising non-GAAP operating margin guidance to 28%, representing a 550 basis point improvement year-over-year. This guidance increase is driven by the acceleration of our restructuring efforts and also includes reinvestment in targeted areas, namely in R&D. I’m proud of our progress and remain confident in our trajectory as we progress towards our 30% non-GAAP operating margin target in Q1 of fiscal year 2025.
We also remain focused on stock-based compensation and continue to expect it to improve this year to below 9% as a percent of revenue. Before moving to EPS, on restructuring, we now expect the charges in FY 2024 to come in towards the higher end of the range previously provided in our last earnings release. As a result of these updates, we now expect fiscal year 2024 GAAP EPS of $2.67 to $2.69, including estimated charges for the restructuring of $1.11. Non-GAAP EPS is now expected to be $7.41 to $7.43. And we are raising our fiscal year 2024 operating cash flow growth to be approximately 16% to 17%, which now includes a 14 point to 16 point headwind from restructuring. As a reminder, we will see an increase in our cash taxes in fiscal 2024 as we draw down our remaining net operating losses.
CapEx for the fiscal year is expected to be slightly below 2.5% of revenue. This results in free cash flow growth of approximately 17% to 18% for the fiscal year. Now to guidance for Q2. On revenue, we expect $8.51 billion to $8.53 billion, growth of approximately 10% in both nominal and constant currency. CRPO growth for Q2 is expected to be approximately 10% year-over-year in nominal and constant currency. Our guidance incorporates the momentum of our execution in Q1, offset by the persistent measured buying behavior and a decline in professional services fixed fees contribution. The professional services impact represents approximately a 1 point headwind to growth. For Q2, we expect GAAP EPS of $0.79 to $0.80 and non-GAAP EPS of $1.89 to $1.90.
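As a rough back-of-the-envelope check on how these pieces fit together (free cash flow is simply operating cash flow less capital expenditures), the rounded figures quoted on the call imply roughly $0.3 billion of Q1 CapEx and a full-year CapEx budget under about $0.87 billion at the top end of revenue guidance. This is purely illustrative arithmetic on the stated numbers, not additional disclosure.

```python
# Back-of-the-envelope check using the rounded figures quoted on the call.
# Free cash flow = operating cash flow - capital expenditures.

q1_operating_cash_flow = 4.5   # $B, as stated in the prepared remarks
q1_free_cash_flow = 4.2        # $B, as stated
implied_q1_capex = q1_operating_cash_flow - q1_free_cash_flow
print(f"Implied Q1 CapEx: ~${implied_q1_capex:.1f}B")       # ~$0.3B

fy24_revenue_high_end = 34.7   # $B, top end of revenue guidance
capex_pct_of_revenue = 0.025   # guided "slightly below 2.5% of revenue"
implied_fy_capex_ceiling = fy24_revenue_high_end * capex_pct_of_revenue
print(f"Implied FY24 CapEx ceiling: ~${implied_fy_capex_ceiling:.2f}B")  # ~$0.87B
```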
And as we focus on shareholder return and disciplined capital allocation, we continue to expect to fully offset our stock-based compensation dilution through our share repurchases in fiscal year 2024. In closing, we continue to transform every corner of the company. We are hyper-focused on delivering the next wave of innovation led by Data Cloud and Einstein GPT. And Salesforce is well-positioned to remain the market leader in this new AI-first world. We are committed to delivering long-term shareholder value, and I personally want to thank our shareholders for their continued support. Now, Mike, let’s open up the call for questions.
Mike Spencer: Thanks, Amy. Operator, we’ll move to questions now. I ask that everyone only ask one question in respect for others on the call. In addition, I’d like to introduce Srini Tallapragada, our Head of Engineering, who will be joining us for Q&A today. With that Emma, let’s move to the questions.
Q&A Session
Operator: Thank you. [Operator Instructions] Your first question today comes from the line of Kirk Materne with Evercore. Your line is open.
Kirk Materne: Hi, yes. Thanks very much and congrats on a good start to the year. Marc, you’ve been through a number of cycles from a technology perspective. I was just kind of curious where you think we are in terms of people investigating AI versus when the spending cycle around it might kick-in? Can you just give us an idea of, you know, sort of your thoughts on that and really just the opportunity for you all to monetize AI with your product base? Thanks.
Marc Benioff: Well, I think this is the absolute question of the day, which is we are about to enter an unbelievable super cycle for tech and everyone can see that. This is an incredible opportunity for not only Salesforce, but our entire industry. I mean, perhaps only a year ago or less than a year ago, no one on this call even knew what GPT was. Today, ChatGPT is the fastest growing consumer product of all time and has transformed many, many lives. It’s definitely not just the technology of this lifetime, but maybe any lifetime. It’s an incredible technology. And every company is going to have to transform because every company is going to have to become more productive, more automated, and more intelligent through this technology to be competitive with other companies.
And just yesterday, I’m in a room here at the top of Salesforce Tower on the 60th floor, and we have the CEO of a very large bank here. And like every other sales call I’ve made in the last quarter, there’s only one thing that customers want to talk about, and that’s artificial intelligence and specifically, generative AI. Of course, we have been a leader in this area with Einstein, more than 1 trillion transactions delivered this week, but these are primarily predictive transactions built on machine intelligence, machine learning, and deep learning. But in 2018, deep learning evolved and became much more sophisticated and became generative as these neural networks expanded their capabilities and also the hardware went to another level as well.
So, now we have this incredible new capability. It’s a new platform for growth, and I couldn’t be more excited. But yesterday, there were many questions from my friend, who I’m not going to give you his name because he’s the CEO of one of the largest and most important banks in the world. And I’ll just say that, of course, his primary focus is on productivity. He knows that he wants to make his bankers a lot more successful. He wants every banker to be able to write a mortgage, but not every banker can, because writing the mortgage takes a lot of technical expertise. But as we showed him in the meeting, through a combination of Tableau, which we demonstrated, and Slack, which we demonstrated, and Salesforce’s Financial Services Cloud, which he has tens of thousands of users on, that banker understood that this would be incredible.
But I also emphasized to him that LLMs, or large language models, have a voracious appetite for data. They want every piece of data that they can consume, but through his regulatory standards, he cannot deliver all that data into the LLM because it becomes amalgamated. Today, he runs on Salesforce, and his data is secure down to the row and cell level. He knows that readers don’t block writers, that there are all types of security provisions regarding who can see what data about what account or what customer. And when you put it into an LLM, those permissions are not understood. So, that is a very powerful moment, to realize that the way LLMs operate is that they’re kind of consuming all this data and then giving us that information back out. Well, that is Salesforce’s opportunity.
That’s why we built this GPT trust layer. And through the GPT trust layer, we’re rebuilding all of our apps, including Slack and Tableau, and, as we demonstrated to him yesterday, a new Sales Cloud, a new Service Cloud, a new Marketing Cloud, and what we’ll show on June 12 in New York City, a complete reconceptualization of our product line. What that means for this customer and for every customer is that they have an opportunity to transform their business. And for Salesforce, that also means an opportunity to transform ourselves, and for our industry, a new super cycle where every company will have to transform to be AI-first.
Operator: Your next question comes from the line of Keith Weiss with Morgan Stanley. Your line is open.
Elizabeth Porter: Great. This is Elizabeth Porter on for Keith Weiss. Thanks for the question. I wanted to ask on the potential disruption from rebooting the sales enablement process. Are we past the point of seeing disruption, or could that be a future risk? And if so, how is it included in guidance? The CRPO guidance of 10% looks like a bit of a slowdown despite the easier comp. And Amy, you called out pro services as a one-point headwind. But just any other factors we should keep in mind that may create a challenge over the next couple of months? Thank you.
Marc Benioff: Well, I’ll tell you that. I think that as you know, in Q1, we went through tremendous disruption with human resources in our company, and it was very disruptive to all of our Ohana. And I’m so grateful to them for how they supported the whole company, all the customers and themselves during what was probably one of the most disruptive quarters that I’ve seen and yet we delivered these incredible numbers and this incredible technology vision going forward. In terms of enablement of the sales organization, its ability to kind of move forward, that is not, I would say, a material part of what happened in the quarter or what’s going to happen for the year. Our sales organization remains with a very high level of productivity, but let me turn it over to Brian to speak directly to his strategy on delivering the year.
Brian Millham: Yes, Marc, thank you. I appreciate it. And Elizabeth, thank you for the question. I think you’re referencing some comments we made on previous calls about enablement being an important strategy for us as we saw during the pandemic, not as many of our AEs and SEs and leaders were as enabled as we would like. We’ve made those changes, and we’ve really invested in the time to make sure our AEs understand our product portfolio, the entire customer 360, and we’re on sort of the next generation of enablement. As Marc just talked about, this new AI wave is going to create a huge opportunity for us. And we need to make sure that we’re investing in the enablement to bring our teams along. It’s been a very short window around this innovation, and we’ve got some work to do on this, but we’re very, very excited with our path forward, our position in the market.
All that we’re doing with our customers, the demand we’re feeling from our customers. Marc mentioned it, and I had the same experience, every CEO in the world is talking to us about generative AI right now, and we are investing heavily to make sure our account executives, our sales teams, in fact, the entire company is able to articulate our value proposition to our customers. So, Amy, I don’t know if you have any further comments there?
Amy Weaver: Sure. Elizabeth, you mentioned CRPO and professional services, so let me jump in on that. For our guide for this next quarter, we are seeing some pressures from the macro situation and then also specifically from professional services. And there’s a bit of a nuance with ProServ that I want to make sure people understand. So, if you back up, our customers can contract for professional services in two ways: either on a time and materials basis, which is typically used for smaller projects, or on a fixed fee, kind of milestone basis. For purposes of CRPO, we only include projected revenue from fixed fee deals. One of the things that we are seeing right now is not only professional services as a whole seeing pressure, but more customers choosing to contract on a time and materials basis, which is not included in our CRPO.
So, as a result, we’re seeing kind of a double pressure there. And I’m expecting a full one-point headwind to CRPO for the quarter from professional services.
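To illustrate the accounting nuance Amy describes, here is a small hypothetical sketch: only fixed-fee professional services contracts flow into CRPO, so a mix shift toward time-and-materials work lowers CRPO even if total services bookings were unchanged. The figures and field names are invented for illustration only.

```python
# Hypothetical illustration of the CRPO nuance described above. Only
# fixed-fee professional services contracts contribute to current remaining
# performance obligation; time-and-materials work does not. Numbers invented.

def crpo_contribution(contracts: list[dict]) -> float:
    """Sum only the fixed-fee contracts; T&M work is excluded from CRPO."""
    return sum(c["value"] for c in contracts if c["type"] == "fixed_fee")

before_shift = [
    {"type": "fixed_fee", "value": 70.0},           # $M, hypothetical
    {"type": "time_and_materials", "value": 30.0},
]
after_shift = [
    {"type": "fixed_fee", "value": 40.0},           # same total bookings...
    {"type": "time_and_materials", "value": 60.0},  # ...but more of it is T&M
]

print(crpo_contribution(before_shift))  # 70.0
print(crpo_contribution(after_shift))   # 40.0 -> a CRPO headwind
```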
Mike Spencer: Thanks Elizabeth. Emma, let’s move to the next question please.
Operator: Your next question comes from the line of Brad Sills with Bank of America. Your line is open.
Brad Sills: Oh, wonderful. Thanks. I wanted to ask a question to Brian, I think, here on the efforts here to improve productivity. You mentioned removing some layers here. My question is, we think of all these actions that you’re taking as drivers of margin expansion, but are you starting to see some early traction here on the sales productivity front, such that perhaps that’s driving some upside here across the business, perhaps larger deals now that you’re seeing coming out of the field and pipeline and some of the deal closure? Thank you so much.
Brian Millham: Thanks, Brad, for the question. I really appreciate it. As you know, we’re operating in a constrained environment right now. And so, we are really focused on this productivity measure and metric for our organization right now, investing heavily, as I mentioned earlier, in the enablement part of our organization, and also looking at other ways to drive productivity. One of the things that we’re talking quite a bit about right now is pricing and packaging, bringing together logical products that we can be selling in a single motion versus our go-to-market, which is largely aligned by product. How do we focus on a larger average deal size for every transaction? And so big investments on that front, really a strong focus on productivity as it relates to moving people upmarket as well.
We’re thinking about self-serve in the bottom end of our market. How do we drive a self-serve motion, automated motion at the low end of our market to bring our account executives upmarket to drive higher productivity in the sales organization? So clearly, a big motion for us right now. Feel very good about our big deal motion. Actually in Q4, we saw some – sorry, in Q1, we saw some very good big deal execution from the team. That is not really an area that has held us back. We feel very good about our ability to transform companies and transact these large businesses. It really is the velocity business that has held us back a bit on our create and close some of the SMB transactions. So, we have a clear focus in this area to drive the productivity with our plans going into Q2 and beyond into Q4.
Mike Spencer: Thanks, Brad. Emma, next question please.
Operator: Your next question comes from the line of Brent Thill with Jefferies. Your line is open.
Brent Thill: Amy, regarding the Americas, that was a pretty large decel, one of your slowest growth quarters, I think, ever in the Americas. The rest of the world did decel, but maybe not quite to the magnitude of the Americas. Can you just speak to what happened there in that region?
Amy Weaver: Sure. So thanks, Brent, for the question. The Americas did see a deceleration, with 10% year-on-year revenue growth, compared to 17% in EMEA and about 24% in APAC, both in constant currency. We are continuing to see most of the pressure in North America. There were some real pockets of acceleration in EMEA and in LatAm, particularly in Switzerland and, I think, Brazil and Italy. So, we are seeing some good things, but North America has taken the brunt of the deceleration. Brian, do you want to come in and see if you can address that in more detail?
Brian Millham: Sure. Yes. I think when we think about our business from an industry perspective, we have a very nice footprint with great technology companies and financial services companies, both of which were a bit slower than we would have liked in the Americas in Q1. And so, as we think about the all-in size of our Americas business, those industries felt a little bit more of the economic headwinds in the quarter in Q1. And so, I think a bit of a slowdown from that perspective is the result you’re seeing in the Americas business.
Mike Spencer: Thanks, Brent. Emma, next question please.
Operator: Your next question comes from the line of Mark Murphy with JPMorgan. Your line is open.
Mark Murphy: Thank you very much. And I’ll add my congrats. So Marc, it feels like the tech and software industry has had a recession without the broader economy being in a recession quite yet, and that’s very unusual. Do you think with all the purging and optimizing of IT budgets, which is already taking place, plus Salesforce’s headcount optimization already being underway that perhaps the next recession might actually be more manageable or easier to navigate than what you had seen in some of the prior cycles?
Marc Benioff: Well, I think that this is a great question. And I tried to address it on the last call. I just really think you have to look at 2020, 2021 was just this massive super cycle called the pandemic. I don’t know if you remember, but we had a pandemic a couple of years ago. And during that, we saw tech buying like we never saw. It was incredible and everybody surged on tech buying. So, you’re really looking at comparisons against that huge mega cycle. And that is what I think is extremely important to understand, the relative comparisons. And that is where my head is at, which is I am constantly comparing against what happened in 2021, but also looking at 2020 and 2019. That’s a little bit different than 2008 and that’s a little bit different than 2001.
We didn’t exactly have these huge mega cycles that kind of we were exiting. And I – that’s also what gives me tremendous confidence going forward and what we’re really seeing is that customers are absorbing the huge amounts of technology that they bought. And that is about to come, I believe, to a close. I can’t give you the exact date, and it’s going to be accelerated by this AI super cycle.
Mark Murphy: Thank you.
Mike Spencer: Thanks, Mark. Emma, next question please.
Operator: Your next question comes from the line of Brent Bracelin with Piper Sandler. Your line is open.
Brent Bracelin: Good afternoon. I wanted to circle back to the generative AI discussion, if we could. I totally understand how large enterprises are turning to Microsoft, given the productivity tools and suite that they have, but as you start to engage with customers, what’s resonating relative to the Salesforce Gen AI journey? Is it the data layer and Customer 360 message that’s resonating? Is it the app layer around sales automation functionality that you’re going to offer? Just double-click on what customers are coming to Salesforce and engaging with you around, some of the new things that it sounds like we’ll hear about in June.
Marc Benioff: Well, I think that when you look at our artificial intelligence strategy, where we’re talking to the largest, most important companies and governments in the world, it has to be architected around security. It has to be architected around compliance, around trust. It has to be architected around governance. And this is very important. And of course, we’re also architecting it around being open. That is, we’re working with many AI companies to provide the best solutions for our customers. Of course, we have a tremendous relationship with OpenAI. We also just invested in Anthropic [indiscernible] many of these companies. But I think ultimately, this is going to be a solution where enterprise customers are going to come in and make sure that their data is protected.
And it’s also protected down at the user level. And Srini, do you want to come in and talk about exactly what we’re doing to make sure that we’re delivering the best possible solutions for our customers for AI?
Srini Tallapragada: Yes, Marc. So, I think I met about 70 customers in the last quarter. And like Marc was saying, the only conversation everybody is interested in is this – and while everybody understands the use cases, they’re really worried about trust. And what they are looking to us for is guidance on how to solve that. For example, we are doing a lot of things at the basic security level, like we are really doing tenant-level isolation coupled with a zero-retention architecture at the LLM level, so the LLM doesn’t remember any of the data. Along with that, for them to use these use cases, they have a lot of these compliances like GDPR, ISO, SOC, [Quadrant] [ph], and they want to ensure that those compliances are still valid, and we’re going to solve for that.
In addition, the big worry everybody has is, people have heard about hallucinations, toxicity, bias – this is what we call model trust. We have a lot of innovation around how to ground these models on Customer 360 data, which is a huge advantage we have, and we are able to do a lot of things at that level. And then there is the thing which I think Marc hinted at, which is that LLMs are not like a database. There is intra-enterprise trust: even once you have an LLM, you can’t open the data to everybody in the company. So, you need the ability to control who can access this data, both before the query and after the query, and we have to build that. And then we have to be not only open, but also optimized. The way we’ll run is, we’ll run like a model [indiscernible], because one of the things everybody has to watch out for is, it’s great, but what about the cost to serve? Not all models are equal.
So, we are going to run this and pick very – we are going to pick a very cost-optimized curve, so the value is very high. And our Salesforce AI Research team has a lot of state-of-the-art models for industry use cases, which we are optimizing to run at very low cost and high value. Add to that, we’ve got the Trailblazer platform, which allows low code, high code, and many other things, and we’re going to optimize for the jobs to be done for each industry. That’s really what they’re looking for, because they have been using our AI platform. Like Marc mentioned, we already do 1 trillion transactions per day. And by the way, with Data Cloud, just in a month, we are importing more than 7 trillion records into the data layer, which is a very powerful asset we have.
So, coupled with all of this, that is where they are looking for guidance, and that is how we think we can deliver significant value to our customers.
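A minimal sketch of the kind of request pipeline Srini outlines: a pre-query entitlement check, tenant-isolated retrieval for grounding, a zero-retention model call, and post-query screens for issues like toxicity or data leakage. All names are hypothetical; this is an illustration of the pattern, not Salesforce’s actual trust layer API.

```python
# Hypothetical sketch of a trust-layer request pipeline, not a real API.
# Pre-query permission check -> grounded retrieval -> zero-retention model
# call -> post-query screening of the generated response.

class TrustLayerError(Exception):
    pass


def handle_request(user, question, permissions, retriever, llm, screens):
    # 1. Pre-query check: is this user entitled to query this data domain?
    if not permissions.can_query(user, question):
        raise TrustLayerError("User is not entitled to this data")

    # 2. Grounding: fetch only records visible to this user (tenant-isolated).
    records = retriever.fetch(user, question)

    # 3. Zero-retention model call: context is sent but never stored by the LLM.
    draft = llm.complete_ephemeral(question, context=records)

    # 4. Post-query screens: toxicity, bias, and data-leak checks on the output.
    for screen in screens:
        draft = screen.apply(user, draft)

    return draft
```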
Marc Benioff: Srini, I want to ask you a question. In January, you published a paper in Nature from your research team, which was called “Large language models generate functional protein sequences across diverse families,” and you really showed something amazing, which was that deep learning language models have shown this incredible promise that you just articulated in various biotechnological applications, including protein design and engineering. And you also described very well one of our models that we’ve created internally, ProGen, which was a language model that can generate protein sequences with predictable function across large protein families. I was very impressed with that. And the entire research team deserves a huge amount of congratulations.
So, when you look at that, especially grammatically and semantically correct natural language sentences for diverse topics, and how you’re going to use that inside our platform against other models that you’re seeing, like Llama, OpenAI’s models, Anthropic’s and others – when will Salesforce use our own models like CodeGen, ProGen, T-code, our lit model? When will we use an outside commercial model like an OpenAI or an Anthropic? And when will we go to an open-source model, like so many of those we’ve seen emerge, including Llama?
Srini Tallapragada: Yes. I think you hinted at something very important. I think, as you know, Marc, our AI Research team has some of the best-in-class, state-of-the-art models in different areas. The way we are thinking of it is, like anything else, where the world is going to go – which we strongly believe – is going to be multiple models. And depending on the use case, you will pick the right models, which will provide you the value at the lowest cost. Where we are running with highly regulated industries, where the data cannot leave the trust boundary, or where we have a significant advantage, where we can train on industry-specific data or Salesforce-specific – 360-specific data, like, for example, our FX model helping our customers implement, or our Flow, we will use our internal models.
Where we need more generative image models or something where it needs public image databases, we may use a Cohere or an OpenAI. It depends on the use case, which is why, for a given request, a secure trusted gateway will decide smartly which is the best model for the use case, and we always keep running the [indiscernible], which is what I mean. So today, one particular model may be good. Tomorrow, something else will come, and we’ll, behind the scenes, flip it, but our customers don’t need to know that. We will handle all of it. We’ll handle the model trust. We’ll handle all the compliances, all behind the scenes. And this is what we always promise to our customers: we’ll always future-proof. That’s the Salesforce promise to our customers, so that they can focus on the business use cases.
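To make the routing decision Srini describes concrete, here is a hypothetical sketch: regulated workloads stay inside the trust boundary, an internally trained domain model is preferred when one exists, and otherwise the cheapest model that clears a quality bar is chosen. Model names, scores, and thresholds are invented for illustration and are not a statement of Salesforce’s actual policy.

```python
# Hypothetical model-routing sketch: keep regulated data inside the trust
# boundary, prefer a domain-tuned internal model when it exists, otherwise
# pick the cheapest candidate that clears a quality floor. Re-run the choice
# as models and benchmarks change. All names and numbers are illustrative.

from dataclasses import dataclass


@dataclass
class UseCase:
    name: str
    regulated: bool          # data cannot leave the trust boundary
    has_domain_model: bool   # an internally trained model fits this task
    quality_floor: float     # minimum acceptable evaluation score, 0..1


# (model name, runs inside the trust boundary?, relative cost, eval score)
CANDIDATES = [
    ("internal-domain-model", True, 1.0, 0.86),
    ("open-source-llm",       True, 0.6, 0.78),
    ("commercial-api-llm",    False, 2.5, 0.92),
]


def route(use_case: UseCase) -> str:
    candidates = CANDIDATES
    if use_case.regulated:
        # Regulated workloads never leave the trust boundary.
        candidates = [c for c in candidates if c[1]]
    if use_case.has_domain_model:
        return "internal-domain-model"
    eligible = [c for c in candidates if c[3] >= use_case.quality_floor]
    # Cheapest model that clears the quality bar wins.
    return min(eligible, key=lambda c: c[2])[0] if eligible else candidates[0][0]


print(route(UseCase("mortgage summary", regulated=True,
                    has_domain_model=False, quality_floor=0.75)))
# -> "open-source-llm": the cheapest in-boundary model clearing the bar
```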
Marc Benioff: So just one last follow-up question. You’ve described very well the GPT trust layer, which I think is going to be a significant amount of value added that we’re going to provide to our customers that’s going to be quite amazing. And then you develop these specific grounding techniques, which are going to allow us to keep our customers’ data safe and not be consumed by these voracious large language models, which are so hungry for all of our customers’ data. What is going to be the key to actually delivering this now across regulated industries?
Srini Tallapragada: I think the key is the innovations we are doing, which people will see starting next month, around what we call [from generation] [ph] and grounding. These are techniques which we’ll have to do, but it will work only because all of this is based on underlying data. We have the Data Cloud, where we have all the 360 data. So, we’re able to ground these models and do it. There are a lot of other techniques, which are very technical, which we put on our blog. But that’s the innovation that we’re doing. And you have to remember that Salesforce is also a metadata model. So, we have a semantic understanding of what our customers are trying to do. We’re going to leverage the metadata platform and do this grounding automatically for our customers, of course, while keeping the trust. That’s the baseline.
Marc Benioff: Absolutely. Thank you so much, Srini.
Mike Spencer: Emma, next question please.
Operator: Your next question comes from the line of Raimo Lenschow with Barclays. Your line is open.
Raimo Lenschow: Hi, thank you. Question for Amy, or Brian maybe. The improvement in profitability, or the raised guidance for profitability and cash – is that all timing? Can you talk a little bit about that? Is it just timing, or are there other factors we should consider in here? Thank you.
Amy Weaver: So Raimo, I’ll start and then I can turn it over to Brian for a little bit more color. In terms of the great Q1 that we just saw, I was really pleased to see us coming in at 27.6%, and also really pleased about the raise to 28% for the full year. What really drove the 27.6% was two things: the actions that we took that we announced in January with the restructuring, executing on that, as well as having a very disciplined reinvestment strategy. And that’s also what we’re going to see going through the rest of the year, driving the expansion to 28% and then also putting us on track for the 30% margin in Q1 of next year. As I look overall at the transformation, I would really divide it into two stages. The first is the benefits that we’re getting from that initial transformation.
And again, that’s what you’re seeing in Q1 and this year. And then the second stage, which is really as we’ve been going through this comprehensive operating and go-to-market review, that review is going to enable the second phase of our transformation, and that’s something that’s going to be ongoing and long-term over the next few years. You’ll see benefits to our margin in outer years beyond FY 2024. Brian, anything you would add?
Brian Millham: Yes, thanks for the question. When we think about longer-term structures, we obviously took the action in Q1. But longer term, we’re looking at things like how we leverage comp plan redesign to drive better efficiencies in our organization going forward, and how we continue to look at self-serve at the low end of the market to drive better efficiencies in our organization. Also, resellers as a potential investment that we’ll make in emerging markets is long-term leverage on the efficiency gains. So lots of things that we’re doing that will be sort of Phase 2, oriented around process improvement and systems improvement. And again, as I mentioned, comp plan design that will drive better efficiencies in the organization.
Mike Spencer: Thanks, Raimo. Emma, let’s go to next question please.
Operator: Your next question will come – is from Karl Keirstead with UBS. Your line is open.
Karl Keirstead: Okay. Great. I’ll direct this to Amy as well. Amy, congrats on that margin improvement. I’ve got a two-parter both related to margins. First, what is the timing of the receipt of that Bain operational review that might ostensibly kick off the second phase of cost cutting? And then secondly, you and Brian talked about this reinvestment in R&D and investing heavily around AI. I’m wondering if those planned investments are greater than you anticipated when you initially set the guidance three months ago, such that you need to run a little bit harder on OpEx management to offset it and keep delivering on your stated margin targets? Thanks so much.
Amy Weaver: Great. Thanks, Karl. So first on the timing. As I mentioned, we’ve been doing this end-to-end comprehensive operating and go-to-market review. The entire company has been involved in that. There’s really no stone unturned. We’re getting close to the end of that process, and then we will be moving into the implementation. You’ll be hearing more about that in future quarters. Turning to reinvestment. We are keeping a very close eye on reinvestment, very excited particularly about artificial intelligence. So, much of what Srini has been talking to you about, I don’t view this as a greater investment from what we were looking at earlier. We’re really going along with our current plans. We are looking at operating expenses management, and we’re looking at it seriously every day, but that’s not something that has changed.
Mike Spencer: Thanks, Karl. Operator, we’ll move to our last question now, please.
Operator: Our last question comes from the line of Kash Rangan with Goldman Sachs. Your line is open.
Kash Rangan: Hi, thank you very much, team. Congratulations on putting up terrific operational results, and good cash flow, good margins, et cetera. Marc, you talked about a super cycle of buying and technology in the years ahead. Can you just parse for us, if you don’t mind, what is new about generative AI as far as Salesforce’s opportunities are concerned, netting out against what Einstein has been able to accomplish for you – for the company? And how does it show up in the product in terms of productivity? What are the scenarios by which customers can experience this amazing productivity? And how can you charge more for delivering that kind of value? Thank you so much.
Marc Benioff: Well, thanks, Kash, for giving me the opportunity to talk about our AI vision, and I’m also going to ask Srini again to fill in some of the details. But I think it started to occur to me – I think folks know, my neighbor, Sam Altman, is the CEO of OpenAI, and I went over to his house for dinner, and it was a great conversation, as it always is with him. And he said, oh, just hold on one second, Marc, I want to get my laptop. And he brought his laptop out and gave me some demonstrations of advanced technologies that are not appropriate for the call. But I did notice that there was only one application that he was using on his laptop, and that was Slack. And the powerful part about that was I realized that everything from day 1 at OpenAI has been in Slack.
And as we kind of brainstormed and talked about it – of course, he was paying a Slack user fee and on and on, and he’s a great Slack customer. We’ve done a video about them; it’s on YouTube. But I realized that taking an LLM and embedding it inside Slack, well, maybe Slack will wake up. I mean, there is so much data in Slack, I wonder if it could tell him what are the opportunities in OpenAI? What are the conflicts, what are the conversations? What should be his prioritization? What is the big product that got repressed that he never knew about? And I realized in my own version of Slack at Salesforce, I have over 95 million Slack messages, and these are all open messages. I’m not talking about closed messaging or direct messaging or secure messaging between employees.
I’m talking about the open framework that’s going on inside Salesforce and with so many of our customers. And then I realized, wow, I think Slack could wake up, and it could become a tremendous asset with an LLM consuming all that data and driving it. And then, of course, the idea is that is a new version of Slack. Not only do you have the free version of Slack, not only do you have the per user version of Slack, but then you have the additional LLM version of Slack. And for each one of our products in every single one of our categories, there’s that opportunity to upsell and cross-sell into the next version of generative AI, not just with Slack, but you can also imagine, for example, even with Salesforce, the ability as we’re going to see in June, that many of our trailblazers are amazing low-code, no-code trailblazers, but soon they’ll have the ability to tap in to our LLMs like ProGen and Cogen that have the ability to code for them automatically.
They aren’t coders. They didn’t graduate with computer science degrees. And if they need to write sophisticated Apex code or other code, it can be a challenge for them, because, you know what, there are only 8 million or 10 million coders in the whole world – but now with LLMs, everybody can start to code. That’s an amazing productivity gain and augmentation of everybody’s skill set. And that’s a great way to look at what could happen, for example, with our core products, but even with Tableau, which has a tremendous programmatic engine as well, or even MuleSoft, which is a highly programmatic product that, coupled with an LLM, can have the ability to go forward. But of course, those LLMs are highly trained models for those specific types of code, and that is something that we would add on either through partnership or through our own LLMs, as Srini described. It’s another layer of value that we can provide to our customers.
In all cases, customers are going to be more productive. They’re going to be more automated, and they’re going to be more intelligent. And as we look at some of the examples that we’ve given, like at the New York World Tour, you saw our Marketing Cloud do something very cool that it couldn’t do even just six months ago. It segmented the database on its own. It wrote an e-mail on its own. Of course, it required editing, and it also built a landing page on its own. That was amazing. Or as we saw at the Tableau Conference, we saw Tableau being able to create its own vizzes, or visualizations – that was incredible. And at our TrailheadDX, we saw Einstein GPT, which started to do these amazing next-generation things. And I think in each of these areas, we can offer more value, but we must do it under the auspices of trust, data integrity, and governance.
And that is what we have been working on now for a considerable amount of time. Of course, we’ve led – we have always wanted to be the Number 1 AI CRM. And we are, if you look at Einstein’s transaction level, I think that that’s enough evidence right there. But I think this idea of generative AI, this starts to reconceptualize every product and we will start to build and develop not only extensions to all of our current products, but entirely new products as well. And we have a lot of exciting ideas of things that we can do to help our customers connect with their customers in a new way using generative AI. Srini, do you want to come in and talk about that?
Srini Tallapragada: Thanks, Marc. So, I think the way I see it is, these AI technologies are a continuum: first predictive, then generative, and the real long-term goal is autonomous. The initial versions of generative AI will be more in terms of assistance. And like Marc was saying, the most common use case everybody understands implicitly is self-service bots in the call center or agent assistance, which I think really helps productivity. But for the other use cases which we are going to see – and in fact, I have rolled out our own code LLMs in our engineering organization – we are already seeing a minimum 20% productivity gain. And in those cases…
Marc Benioff: Well, that’s a very key point. Isn’t it? That you’re seeing a 30% productivity increase in your own engineers using our own LLM.
Srini Tallapragada: 20%, we are seeing minimum. In some cases, up to 30%. Now, a lot of our customers are asking for the same. We are going to roll out Einstein GPT for our developers in the ecosystem, which will help not only the low-code developers to bridge the gap where there’s a talent gap, but also reduce the cost of implementations for a lot of people. So there’s a lot of value. This assistant model is where we’ll see a lot of uptick. And then I think for the fully autonomous cases, for example, in our own internal use cases with our models, we are able to detect 60% of incidents and auto-remediate. That requires a little bit more fine-tuning, and we’ll have to work with specific customers to get to that level of model performance.
So, I see this is just the start of this [cut] [ph]. The assistant model is the initial thing, to build trust with a human in the loop and validate it. And then as the models get better and better, we’ll keep taking on use cases where we can fully automate it.
Marc Benioff: And address this one issue that a lot of customers come in like they did yesterday, and they tell us they think they’re just going to take all of their data, all their customer data, all of their information and put it into an LLM and create a corporate knowledge base, and it’s going to be one amalgamated database. Why is that a false prophecy?
Srini Tallapragada: Because even today, in any example you see, even though we have hundreds of Slack channels, there are a lot of specific Slack channels which only certain people should have access to. You don’t want that. The LLM doesn’t know. There is no concept of that – it combines all this information. So, unless you put in the layer, both before – who can access the data – and then when it generates a response, what it can do, you don’t want one wealth manager to generate an account report where you’re mixing customers’ balances. So there are a lot of trust issues we have to solve. So, LLMs are good for a lot of very creative generative use cases initially, where it’s public data that everybody can use. Those are the use cases. I think there is enough low-hanging fruit in the initial phases with the assistant model, which we’ll solve.
The really complex automated cases – the role-level, record-level sharing – we have a lot of techniques which we are developing, which we will do. It’s also a research area, too. On that one, I think we should temper expectations, but like I said, for the developer example I gave, there’s enough productivity which we will get.
Marc Benioff: Well, we’re really excited to show all of this technology at our AI Day on June 12 in New York City. And then also when we get to [Dreamforce GPT] [ph], we’re going to have an incredible demonstration of this technology.
Mike Spencer: So with that, we want to thank everyone for joining us today, and we look forward to seeing everyone over the coming weeks. Have a great one.
Operator: This concludes today’s conference call. You may now disconnect.