Oracle Corporation (NYSE:ORCL) Q1 2025 Earnings Call Transcript September 9, 2024
Oracle Corporation beats earnings expectations. Reported EPS is $1.39, expectations were $1.33.
Operator: Hello, and welcome to the Oracle Corporation Q1 Fiscal Year 2025 Earnings Call. All lines have been placed on mute to prevent any background noise. After the speaker’s remarks, there will be a question-and-answer session. [Operator Instructions] I would now like to turn the conference over to Ken Bond, Head of Investor Relations. You may begin.
Ken Bond: Thank you, Sarah, and good afternoon, everyone, and welcome to Oracle’s First Quarter Fiscal Year 2025 Earnings Conference Call. A copy of the press release and financial tables, which includes a GAAP to non-GAAP reconciliation and other supplemental financial information, can be viewed and downloaded from our Investor Relations website. Additionally, a list of many customers who purchased Oracle Cloud Services or went live on Oracle Cloud recently will be available from the Investor Relations website. On the call today are Chairman and Chief Technology Officer, Larry Ellison; and Chief Executive Officer, Safra Catz. As a reminder, today’s discussion will include forward-looking statements, including predictions, expectations, estimates or other information that might be considered forward-looking.
Throughout today’s discussion, we will present some important factors relating to our business, which may potentially affect these forward-looking statements. These forward-looking statements are also subject to risks and uncertainties that may cause actual results to differ materially from statements being made today. As a result, we caution you against placing undue reliance on these forward-looking statements, and we encourage you to review our most recent reports, including our 10-K and our 10-Q and any applicable amendments, for a complete discussion of these factors and other risks that may affect our future results or the market price of our stock. And finally, we are not obligating ourselves to revise our results or these forward-looking statements in light of new information or future events.
Before taking questions, we’ll begin with a few prepared remarks. And with that, I’d like to turn the call over to Safra.
Safra Catz: Thanks, Ken, and good afternoon, everyone. Before I go to our Q1 numbers, I thought I’d take only a moment to review some of the things that you’ll be hearing about over the next couple of days from Oracle. We are at Cloud World in Las Vegas, and Cloud World is where we come together with our customers and partners to share experiences and showcase our latest products and services. Our customers are our best spokespeople when they share how our technologies transform their enterprises. The innovations from our labs and research centers, in combination with feedback from our customers, have helped us build superior products and services. You’ll hear about new cutting-edge features within OCI, database, analytics, Fusion, NetSuite and our industry applications.
We will also be showing new capabilities that we’ve been working on for a while, including embedded AI agents in Fusion, and those drive productivity and efficiencies for our customers when they’re rolled out. And as an Oracle Fusion customer myself, I take great pride once again in my team’s ability to have us announce earnings and give guidance nine days after the quarter ended. Many, many of our customers ask us how to replicate our results. And at Cloud World, we will be having a lot of Oracle playbook conversations this week. You’ve already seen today’s announcement of our partnership with Amazon Web Services, which has now joined Microsoft Azure and Google Cloud in making OCI and Oracle available in their respective clouds. Needless to say, we think our multi-cloud strategy will expand the ubiquity and popularity of our differentiated technologies, especially the Oracle Database.
Larry will share more details in just a moment. But now to Q1, which was clearly another outstanding quarter, with total revenue at the high end of my guidance and earnings per share $0.04 above the high end of guidance. Currency was essentially in line with my guidance. And as usual, I’ll be discussing our financials using constant currency growth rates, as this is how we manage the business. Total cloud revenue, that is SaaS and IaaS, was up 22% at $5.6 billion, with SaaS revenue of $3.5 billion, up 10%, and IaaS revenue of $2.2 billion, up 46%, on top of the 64% growth reported last year. As a reminder, we exited the advertising business last quarter, which had the effect of lowering the total cloud applications revenue by 2% this quarter.
Total cloud services and license support for the quarter was $10.5 billion, up 11%, driven again by our strategic cloud applications, autonomous database and OCI. Application subscription revenues, which include product support, were $4.8 billion, up 7%. Our strategic back-office SaaS applications now have annualized revenues of $8.2 billion and were up 18%. Infrastructure subscription revenues, which include license support, were $5.8 billion, up 14%. Infrastructure cloud services revenue was up 46%, and up 49% when excluding our legacy hosting services. Our infrastructure cloud services now have annualized revenue of $8.6 billion. OCI consumption revenue was up 56%, and demand continued to outstrip supply. Cloud database services were up 23% and now have annualized revenues of $2.1 billion.
Very importantly, as on-premise databases migrate to the cloud, either to OCI directly or through our Database@ cloud services with Azure, Google and AWS, we expect those cloud database revenues collectively will be the third leg of revenue growth alongside OCI and strategic SaaS. Database subscription revenues, which include database license support, were up 4%. Software license revenues were up 8% to $870 million, including Java, which saw excellent growth. So all in, total revenue for the quarter was $13.3 billion, up 8% from last year. Shifting to gross profit and operating income, the gross profit dollars of cloud services and license support grew 9% in Q1. As our cloud businesses continue to scale, the gross margins of both cloud applications and cloud infrastructure have each been climbing higher.
We continue to display operating expense discipline, with Q1 operating income growing 14% and the operating margin at 43%. The non-GAAP tax rate for the quarter was 18.9%, with non-GAAP EPS at US$1.39, up 17% in USD and up 18% in constant currency. The GAAP EPS was US$1.03, up 20% in USD and up 22% in constant currency. Included in my guidance at the beginning of the quarter was the expected completion of an assessment of the useful lives of our server and networking equipment, resulting in an increase of the estimated useful lives from five years to six years, effective at the beginning of this fiscal year. This change in accounting estimate reduced Q1 operating expenses by about $197 million. At quarter end, we had nearly $11 billion of cash and marketable securities, and the short-term deferred revenue balance was $11.5 billion, up 2%.
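To put rough numbers on the useful-life change described above, here is a minimal Python sketch of straight-line depreciation before and after extending equipment life from five to six years. The $12 billion asset cost is a hypothetical figure chosen only for illustration, not an Oracle disclosure, and a real change in estimate is applied prospectively to remaining net book value rather than recomputed from original cost.

```python
# Minimal sketch: straight-line depreciation before and after a useful-life change.
# The $12B asset cost below is a hypothetical figure for illustration only.

def quarterly_depreciation(cost: float, useful_life_years: int) -> float:
    """Straight-line depreciation expense per quarter."""
    return cost / (useful_life_years * 4)

cost = 12_000_000_000                    # hypothetical gross server/networking asset base
old = quarterly_depreciation(cost, 5)    # 5-year life: $600M per quarter
new = quarterly_depreciation(cost, 6)    # 6-year life: $500M per quarter

print(f"Old quarterly expense: ${old / 1e6:,.0f}M")
print(f"New quarterly expense: ${new / 1e6:,.0f}M")
print(f"Quarterly reduction:   ${(old - new) / 1e6:,.0f}M")  # $100M in this hypothetical
```

Even in this simplified form, the direction of the effect matches the roughly $197 million reduction Safra cites: spreading the same cost over more quarters lowers the expense recognized in each one.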
Operating cash flow for Q1 was $7.4 billion, while free cash flow was $5.1 billion. On a trailing 12-month basis, operating cash flow was $19.1 billion and free cash flow was $11.3 billion. Our remaining performance obligations, or RPO, is now $99 billion, up 52% in constant currency. Now while we typically see a seasonal decline in RPO in Q1, we signed several large deals this past quarter, resulting in a sequential increase in RPO rather than the decline we have typically seen over the previous five years. Further, our cloud RPO grew more than 80% and now represents nearly three-fourths of total RPO. And approximately 38% of total RPO is expected to be recognized as revenue over the next 12 months, which reflects the growing trend of customers wanting larger and longer contracts as they see firsthand how Oracle Cloud services are benefiting their businesses.
We spent $2.3 billion on CapEx this quarter. Given the demand that you see in our RPO growth and the additional demand we see in our pipeline, I expect that fiscal year 2025 CapEx will be double what it was in fiscal 2024. As always, we remain careful to pace our investments appropriately and in line with booking trends. We now have 85 cloud regions live, with another 77 planned and more to follow. We have public cloud regions. We have dedicated cloud customer regions. We have national security regions. We have sovereign regions. We have Oracle Alloy regions with our partners, and we have multi-cloud regions with Azure and Google Cloud and now shortly with AWS as well. The sizing flexibility and deployment optionality of our cloud regions continue to be significant advantages for us in the marketplace.
As we’ve said before, we’re committed to returning value to our shareholders through technical innovation, strategic acquisitions, stock repurchases, prudent use of debt and the dividend. This quarter, we repurchased 1.1 million shares for a total of $150 million. In addition, we paid out dividends of $4.4 billion over the last 12 months. And the Board of Directors again declared a quarterly dividend of $0.40 per share. Before I dive into specific Q2 guidance, I’d like to share some overarching thoughts and the benefits that I expect they will bring over the coming years. First, the Oracle database is thriving, and the multi-cloud agreements we now have with Microsoft, Google and AWS make it easier for our customers to run their Oracle databases in the cloud.
Second, we are rapidly expanding our OCI capacity to meet the demand that you see in our 52% RPO growth. Third, while much attention is focused on our GPU-related businesses, our non-GPU infrastructure business continues to grow much faster than our competitors. And finally, our strategic SaaS apps continue to grow while we are starting to see more and more of our industry-based cloud apps come online. All these trends point to revenue growth going higher. We will discuss the implications of these positive trends at our financial analyst meeting on Thursday. For fiscal 2025, we remain very confident in and committed to full year total revenue growing double digits and full year total cloud infrastructure revenue growing faster than last year.
Let me now turn to my guidance for Q2, which I’ll review on a non-GAAP basis. If currency exchange rates remain the same as they are now, currency should have about a 1% positive effect on total revenue and as much as a $0.03 positive effect on EPS, though it’s hard to know for sure, and the actual currency impact may be different. Total revenues are expected to grow 7% to 9% in constant currency and 8% to 10% in USD at today’s exchange rates. Total cloud revenue is expected to grow 23% to 25% in constant currency and 24% to 26% in USD. Non-GAAP EPS is expected to grow 6% to 10% and be between $1.42 and $1.46 in constant currency, and non-GAAP EPS is expected to grow 8% to 12% and be between US$1.45 and US$1.49 in USD.
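As a quick sanity check on the Q2 EPS ranges above, the sketch below backs out the implied prior-year base from the constant-currency guidance and recomputes the USD range. The base is an inference from rounded figures, not a number stated on the call, so treat the output as approximate.

```python
# Sanity-check sketch for the Q2 non-GAAP EPS guidance. The prior-year base is
# inferred from the rounded guided ranges, so the results are approximations.

cc_low, cc_high = 1.42, 1.46          # constant-currency EPS range (6% to 10% growth)
implied_base_low = cc_low / 1.06      # ~1.34
implied_base_high = cc_high / 1.10    # ~1.33
base = (implied_base_low + implied_base_high) / 2

usd_low = base * 1.08                 # 8% USD growth
usd_high = base * 1.12                # 12% USD growth
print(f"implied base ~${base:.2f}, USD range ~${usd_low:.2f} to ${usd_high:.2f}")
# Prints roughly $1.44 to $1.49, consistent with the guided $1.45 to $1.49.
```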
My EPS guidance for Q2 assumes a base tax rate of 19%. However, onetime tax events could cause actual tax rates to vary. Sorry that was so long. And with that, I’ll turn it over to Larry for his comments.
Lawrence Ellison: Thank you, Safra. Today, Oracle has 162 cloud data centers, live and under construction, throughout the world. The largest of these data centers is 800 megawatts, and it will contain acres of NVIDIA GPU clusters able to train the world’s largest AI models. That’s what’s required to stay competitive in the race to build one, just one, of the most powerful artificial neural networks in the world. The stakes are high and the race goes on. Soon Oracle will begin construction of data centers that are more than a gigawatt. Building giant data centers with ultra-high-performance RDMA networks and huge 32,000-node NVIDIA GPU clusters is something that Oracle has proven to be very good at. It’s the reason we’re doing so well in the AI training business.
It’s important to remember that we first developed those high-performance RDMA networks to interconnect our Exadata CPU cluster hardware that powers our Exadata database cloud service. The Oracle Database Cloud service, running on Exadata and Exascale RDMA clusters, provides an order of magnitude better performance, better scalability, better reliability and better security than other databases. And it’s still the world’s only autonomous, fully self-driving database. Our large and loyal customer base understands and appreciates the many technical advantages of using the Oracle database. And those customers wanted us to find a way to make the very latest and best Oracle technology available on other clouds in addition to OCI. We found a way. With today’s AWS announcement, our customers will be able to use Oracle’s latest Exadata and Exascale RDMA clusters with the latest versions of our database software from within the Microsoft Azure cloud, from within the Google Cloud and from within the AWS cloud.
This will enable customers to use the Oracle database anywhere and everywhere. That has always worked well for our customers and for our database business. We believe our cloud partnerships with AWS and Microsoft and Google will turbocharge the growth of our database business for years to come. Back to you.
Ken Bond: Thank you, Larry. Sarah, if you could please poll the audience for questions.
Q&A Session
Operator: Thank you. [Operator Instructions] Your first question comes from the line of John DiFucci with Guggenheim Securities. Your line is open.
John DiFucci: Thank you. Larry and Safra, I mean there’s a lot of good stuff here, but I’d like to ask a question on margins. You keep putting up strong cloud numbers, especially the OCI numbers that look — when you give the guidance and you look at what you have to do to hit them, they look really difficult to do, to say the least. We also saw that upside to RPO, and you’ve pointed out, Safra, the sequential increase. I think the last time there was a sequential increase was because you bought Cerner and they just added RPO because of that. I assume a big part of that’s OCI too. You mentioned it’s cloud, three quarters of it. So that indicates there’s more to come, right? So given the mix of business continuing to lean into that lower-margin OCI, and I know that’s changing over time, but it’s still lower margin today, how should we think of overall margins versus profit for the entire company going forward?
Safra Catz: Okay. Let me start with this and maybe then Larry can add on. So first of all, I want to remind you, remember that third leg of the stool I mentioned, which is our database and autonomous database, that is also part of OCI. And that is beginning to really expand. And our multi-cloud agreements, again, will help OCI gross margins. So, you know, gross margins even this quarter as a percentage increased, regardless of the fact that we have a lot more OCI. And so our business is really only now starting to get real scale. And we have built OCI in a way, and Larry can really expand on it, where it is extremely automated. The management of it is very automated. And as our whole rollout grows, we make more money.
And percentages are great, and again, our operating margin percentages continue to increase. And OCI includes not only base storage and compute and GPUs, but it also includes a lot of other capabilities, including the database, which have excellent margins, too. And of course, as you mentioned, our SaaS business, again an excellent and at-scale business, even that business benefits from our expansion in OCI. And once again, even its high margins continued to improve this past quarter. Larry, I don’t know if you want to…
Lawrence Ellison: I’d like to. So let’s start with SaaS. As we go to autonomous database, we get tremendous efficiencies. We’re moving Fusion and NetSuite to Autonomous Database as we speak. We’ve decided everything needs to move to autonomous for two reasons, really. First reason, when you have a completely autonomous database, the DBA, the database administrator, is a robot. There is no human labor associated with managing the Oracle Autonomous Database. Now okay, that’s obviously a cost savings. But more importantly, with no human labor, there’s no human error. It’s a huge security advantage we have over our competitors. There are no mistakes to be made. There’s no human labor. It’s all automated. And when you have everything completely automated, it’s also truly elastic.
I’m not going to go into exactly what that means. But it means that if you’ve got a job running that suddenly needs 500 microprocessors, you get those 500 for the 3 minutes you need them, and then you return them to the pool. So that’s very different than how other databases work. The cloud itself may be elastic in places, but their databases are typically not elastic. Autonomous is. So we use a lot less hardware. It’s a lot faster. It’s a lot more efficient. It’s fully automated, no human labor, much more secure. And the margins for the autonomous database business are much, much higher than the traditional Oracle business. And I think those margins are, I mean, they’re stunningly high. Around the same margins as SaaS, which are also stunningly high margins, because SaaS runs primarily on that autonomous database.
So we use hardware very efficiently. We use labor sparingly because labor is a security risk. When people are actually doing things manually, it’s a security risk, and it slows down our ability to expand. Every Oracle data center, from the largest to the smallest, is identical in features and functions; they vary only by scale. That means we have one suite of automation software that automates all of this. Nobody else does this. No one has that level of automation, that level of autonomy. It allows us to get much better margins in our database business, in our SaaS business and the rest of our cloud business. Our clouds are more automated, so we have very low labor costs. Our networks are much more efficient; the RDMA networks run so much faster.
If our networks run twice as fast, our costs go down by half. And our networks are much faster than the other clouds. So we think our potential as we scale, our potential to deliver much better margins than we’re currently delivering, is very real.
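To make the elasticity Larry describes concrete, here is a minimal Python sketch of the pay-for-what-you-hold model: a job borrows 500 cores from a shared pool only for the minutes it needs them, then returns them. The class, prices and numbers are hypothetical illustrations of the concept, not Oracle's implementation or API.

```python
import time
from contextlib import contextmanager

class ComputePool:
    """Toy model of an elastic pool: capacity is shared, and a tenant is
    billed only for the cores it holds and only for as long as it holds them."""

    def __init__(self, total_cores: int, price_per_core_second: float):
        self.free_cores = total_cores
        self.price = price_per_core_second

    @contextmanager
    def borrow(self, cores: int):
        if cores > self.free_cores:
            raise RuntimeError("pool exhausted")
        self.free_cores -= cores
        start = time.monotonic()
        try:
            yield
        finally:
            elapsed = time.monotonic() - start
            self.free_cores += cores  # cores go straight back to the pool
            cost = cores * elapsed * self.price
            print(f"used {cores} cores for {elapsed:.1f}s, cost ${cost:.4f}")

# A burst job grabs 500 cores for the time it needs, then releases them.
# The pool size and price are made up for illustration.
pool = ComputePool(total_cores=10_000, price_per_core_second=0.0001)
with pool.borrow(500):
    time.sleep(0.2)  # stand-in for a few minutes of real work
```

The design point is the one Larry makes: a non-elastic system would hold those 500 cores (and bill for them) whether or not the job is running.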
John DiFucci: So is it safe for me to assume that the upside you keep putting up, and hopefully will continue to put up, will add profit, and that profit will actually increase as a percentage, meaning the margins will also increase over time?
Lawrence Ellison: I believe so. And I believe, for example — I mean I think you’d find different points of view from different engineers as we move Fusion to Autonomous Database. I think the cost savings — our cost — our cloud cost savings will be around 50%. That’s what I believe. Now it might be 40%, it might be 35%, but there will be huge cost savings from where we are now, and that’s across the entire base of Fusion customers. So that’s just one example of how we’re using faster networks, faster databases, more automation to make our products more secure. And I keep emphasizing the security is really the primary goal. But as a second order effect, we also end up spending a lot less money to run those data centers.
John DiFucci: Great. Thank you, Larry. Thank you, Safra.
Operator: Your next question comes from the line of Mark Murphy with JPMorgan. Your line is open.
Mark Murphy: Thank you very much and congrats on the great performance. Larry, how do you envision the market transitioning from the AI training phase to the AI inferencing phase? There’s some debate out there on whether we have an imbalance or a bubble on the front end of the curve because training is compute intensive and then perhaps it recalibrates differently somehow for the inferencing stage, which might be less intensive? Or do you see the potential for high growth kind of all the way through both of these phases?
Lawrence Ellison: Well, a lot of people think that, Mark. I send a kid to college and then I’m done; their training is over. I got four years of training, and then I can put the kid to work and they’ll be doing inferencing. And that’s not true. This race goes on forever, to build a better and better neural network. And the cost of that training gets to be astronomical. When I talk about building gigawatt or multi-gigawatt data centers, I mean, for these AI models, these frontier models, the entry price for a real frontier model from someone who wants to compete in that area is about $100 billion. Let me repeat, around $100 billion. That’s over the next four, five years for anyone who wants to play in that game. That’s a lot of money.
And it doesn’t get easier. So there are not going to be a lot of those. I mean, this is not the place to list who can actually build one of these frontier models. But in addition to that, there are going to be a lot of very, very specialized models. I can tell you things that I’m personally involved in, which are using computers to look at biopsy slides or CAT scans to discover cancer. Also, there are blood tests for discovering cancer. Those tend to be very specialized models. Those don’t necessarily use the foundation models, the Groks and the ChatGPTs, the Llamas and the Geminis; they tend to be highly specialized models, trained on image recognition with certain data, I mean, literally millions of biopsy slides, for example, and not much other training data is helpful.
So that goes on, and we’ll see more and more applications like that. So I wouldn’t — if your horizon is over the next five years, maybe even the next 10 years, I wouldn’t worry about, hey, we’ve now trained all the models we need and all we need to do is inferencing. I think this is an ongoing battle for technical supremacy that will be fought by a handful of companies and maybe one nation state over the next five years at least, but probably more like 10. So this business is just growing larger and larger and larger. There’s no slowdown or shift coming.
Mark Murphy: Thank you very much.
Lawrence Ellison: Let me say something that’s going to sound really bizarre. Well, you’d probably say he says bizarre things all the time, so why is he announcing this one? It must be really bizarre. So we’re in the middle of designing a data center that’s north of a gigawatt, and we found the location and the power for it. When we looked at it, they already had building permits for three nuclear reactors. These are the small modular nuclear reactors to power the data center. This is how crazy it’s getting. This is what’s going on.
Operator: Your next question comes from the line of Raimo Lenschow with Barclays. Your line is open.
Raimo Lenschow: Just a question more on the database side, on the agreements that you just announced today, or that you have in place, and now added with AWS. So now that we have all the hyperscaler agreements in place, how do you think about the migration movement of database workloads that are at the moment running on-premise or on Cloud@Customer to the public cloud? I mean, how should we think about that momentum? Thank you.
Safra Catz: We think it’s going to accelerate — no, you go ahead, Larry.
Lawrence Ellison: No, no, no. I think you’re right. Well, there’s two things. Public cloud is very interesting and it’s very important. I mean Oracle became very successful in the database business a long time ago because one of our watchwords was portability. We ran on IBM mainframes. We ran on Microsoft PCs. We ran on Hewlett-Packard machines. And, if you remember them, Digital Equipment machines and all sorts of computers; we ran everywhere. And that was very important so our customers could run the Oracle database in any environment. And obviously, we had to find a way to actually make the best versions of our database, the Exadata, Exascale versions of our database, available in other people’s clouds. And what we were able to do is basically get OCI small enough that we could embed an OCI data center within Microsoft Azure, or an OCI data center within Google or AWS or wherever we had to put it, where it could be fully autonomous and where we could use Exadata and Exascale clusters.
We actually were able to do that. It was not technically easy, but we did it. In doing that and miniaturizing our Oracle data centers, I mentioned earlier that all of our data centers are the same except in scale. The biggest one is 800 megawatts right now, and we’re getting close to 1 gigawatt. The smallest are about 150 kilowatts, and we’re going to get down to 50 kilowatts. What that means is, we’ll have a lot of companies, medium and large-sized companies, that will decide to have an Oracle private cloud. And there’s no difference between our private cloud and the public cloud. They are identical. They’re absolutely identical. And a bunch of people have an Oracle private cloud, a bunch of industrial companies; Vodafone has six Oracle private clouds, for example, to run their workloads.
But they’re becoming so inexpensive that anyone can decide, okay, I want to move to the cloud. I want all the advantages of the cloud, but I want to make sure that I’m the only one in the cloud. I don’t want any neighbors, or I want only approved neighbors. I don’t want someone with a credit card moving in. I’m just paranoid about security, or I’ve got government regulations I have to adhere to. So we think that, obviously, making the Oracle database available in AWS, Microsoft and Google is incredibly important. And Safra said it right: it will absolutely accelerate database growth in the public cloud. But we expect that private clouds will greatly outnumber public clouds as companies decide they want the Oracle Cloud behind their firewall, in their data center, with no neighbors.
And because we’ve gotten our data centers so automated, and they’re scalable, and they’re all identical in terms of function, we’re organized for this. We actually have 162 data centers now. I expect we will have 1,000 or 2,000 or more Oracle data centers around the world, and a lot of them will be dedicated to individual banks or telecommunications companies or technology companies or what have you, or nation states, sovereign clouds, all of this other stuff. So it’s hard for me to predict whether the private clouds or the public cloud is going to be bigger. I don’t know. The good news is we win either way.
Raimo Lenschow: Okay. Interesting. Very interesting.
Lawrence Ellison: Next question, Sarah.
Operator: Thank you. Your next question comes from the line of Mark Moerdler with Bernstein. Your line is open.
Mark Moerdler: Thank you so much and congratulations on the quarter. Very impressive, both the quarter and the guide. We’ve seen a lot of focus on the model training side, but less on applications and inferencing and the rest. You guys have a lot of expertise in the market and in the industry. You already have traditional AI sprinkled throughout all the Oracle products and capabilities. But where do you see the monetizable value of GenAI on the app side? How long do you think it’s going to take for GenAI to be meaningful revenue, not just for Oracle, but for software in general, on the app side, not on the training side? Thank you.
Lawrence Ellison: Well, let me — I’m sorry, I’m monopolizing these answers. I apologize. But let me start with health care, everything from us helping doctors diagnose different diseases. When someone goes in to get a sonogram, I’ve seen the nurses and the technicians and the doctors actually measure the baby’s skull and measure the baby’s spinal cord; it’s utterly ridiculous. The computer should do all of that. And if there’s an umbilical cord wrapped around the fetus, the computer should discover all of that, and it should all be recorded. The doctor could get assistance from a computer doing all of this stuff. Looking at the plaque in coronary arteries, it all should be done that way. Already, when a doctor gets ready to visit a patient, we prepare a summary for the doctor.
We use AI to look at the electronic health records, including the latest labs that might have been done just a few hours ago, and let the doctor know whether there’s stability or disease progression or whatever the doctor needs to know prior to the consultation with the patient. That summary is created by AI, a human-readable summary. Then AI listens to the consultation between the doctor and the patient. This is already delivered. This is already out there. If the doctor orders a prescription, the AI checks to make sure the prescription is accurate and enters the prescription. The AI updates the electronic health records. The AI transcribes and distributes the doctor’s orders, all from listening to the conversation.
The doctor then gets a draft at the end of the conversation that the doctor can quickly review and approve. And then the prescriptions are filled and the orders are executed and the electronic health records are updated. We’re already doing all of that. But I can go on. In health care, we need so many things, from reading of X-rays to just the user interface. Our user interface is so different than Epic’s. I was at Stanford with my son at one time, and it took three people, three different positions, to actually be able to find his X-rays. This is how you find the X-rays for Larry Ellison: you say, Oracle, please show me Larry Ellison’s latest X-ray. It’s a voice interface. You just ask for them. How do you log on? Well, you look at the computer and it recognizes your face.
It recognizes your voice; it knows you’re the doctor and you’re authorized to look at that. All the authorization is done with AI. When are we going to start monetizing it? Well, all of Cerner is the monetization. The fact that we can dramatically expand our health business is because it’s based on AI. AI is just — I don’t know how to describe it. The best way to describe it: it’s not something you sell separately. It’s the diagnostic system. It’s the electronic health record system. It’s the pharmacy system, the prescription system, the user authentication, the log-in system. It’s all AI. And I know people think it’s a separate thing, that, “Oh my God.” And I hear a bunch of application companies say, “Oh, we’ve now got AI agents we’ll charge for separately.”
I mean, our applications are going to be primarily AI applications, everything. How do you charge separately for everything? I really don’t know. I find it bewildering when I listen to them talk. I don’t understand what they’re saying. And I’ll stop there.
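As an illustration of the review-and-approve loop Larry describes in the clinical example, here is a hedged Python sketch in which the AI drafts the summary, orders and prescriptions from the consultation, but nothing is written to the record until the doctor signs off. All names and data shapes are hypothetical; this is not Oracle Health's actual API.

```python
# Minimal sketch of the review-and-approve workflow described above: the AI
# drafts everything, but nothing touches the record until the doctor approves.
# Function names and data shapes here are hypothetical, not Oracle Health APIs.

from dataclasses import dataclass

@dataclass
class DraftNote:
    summary: str
    orders: list
    prescriptions: list

def draft_from_consult(transcript: str) -> DraftNote:
    """Stand-in for the AI step: turn the consult transcript into a draft."""
    # A real system would use speech-to-text plus a language model here.
    return DraftNote(summary=f"Draft summary of: {transcript[:40]}...",
                     orders=["CBC panel"], prescriptions=["lisinopril 10mg"])

def commit_to_ehr(note: DraftNote) -> None:
    print("EHR updated:", note.summary)

def visit_workflow(transcript: str, doctor_approves) -> None:
    note = draft_from_consult(transcript)
    if doctor_approves(note):  # the human approval gate Larry describes
        commit_to_ehr(note)
    else:
        print("Draft rejected; nothing written to the record.")

visit_workflow("Patient reports improved blood pressure ...",
               doctor_approves=lambda note: True)  # doctor reviews and signs off
```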
Mark Moerdler: Larry, I think you are the first person to explain it that way. Thank you. It makes a lot of sense.
Operator: Thank you. Your next question comes from the line of Derrick Wood with TD Cowen. Your line is open.
Derrick Wood: Great. Thanks. And I will echo my congratulations. Safra, Larry, you guys have had this big inflection in RPO growth over the last few quarters. Could you update us on how you’re feeling about supply availability and your ability to stand up data center infrastructure in a time-efficient manner in order to move from contract signing to consumption and convert backlog into revenue? And I guess, are you doing anything different today than, say, a year ago to try and help accelerate these timelines?
Safra Catz: Okay. I’m going to start and Larry can finish. So we have enormous demand, that is absolutely true. And I will say that demand is still outstripping supply. But I can live with that because we are laying out a lot of supply, as you can see in the revenues, and you can see that in the guidance and my commitment for this next year, and we’ll be talking about what comes beyond that on Thursday at the Financial Analyst Day. We have made a number of changes, including what Larry mentioned earlier, which is the automation of setting up and laying down our data centers. However, because demand is so large, we do have to get into different places. As I told you, I don’t know, a couple of quarters ago, we made the decision, instead of picking up small pieces, to actually wait in some cases to pick up larger locations.
And that is really playing out very, very well for us. But we are really moving and growing in so many places because it’s not only public cloud rollouts, but it is also private clouds, which are in immense demand, and national security regions, immense demand. And really, we are moving as fast as we can, and automation is the thing that has helped us lay this out. And the fact that we have the same everything everywhere means nothing is unique. Everything is the same, and that cannot be said by our competitors, and that helps us in our rollout.
Lawrence Ellison: I just want to emphasize what Safra just said. Our private clouds are identical to our public clouds, except for the fact they might only have one tenant, and it might be in a building that you own. Besides that, they’re identical. We own the hardware. We manage the hardware for you. It just happens to be in a building you own, and you’re the only one that can get in. So that’s a very different situation than all of our competitors, and it’s fully automated. So we’re prepared to manage thousands of data centers. By the way, I would compare that to Elon Musk, Starlink, where he’s got, I think, close to 7,000 satellites in the sky now, 6,800. How do you manage — these satellites constantly maneuver. They don’t — they’re not geosynchronous satellites.
They’re low earth orbit satellites. So they’re constantly flying around and changing location. How do you manage 7,000 spacecraft flying around? Well, let me tell you, computers; it’s got to be fully automated or it’s not going to work. You can’t have thousands, or even hundreds I would say, of data centers unless they’re fully automated. And the only way you can automate something is to make them all the same. You can’t automate 25 different things. So that’s one thing. The other thing I’ll point out, and I think it’s interesting about Oracle, is that some of the most senior people in our management team are experts in building buildings, building electric power plants and electric transmission systems.
Because building these data centers is just that. You can’t just build a data center. You also have to account for the energy and the transmission of the energy from where it’s generated to the data center. And of course, the most efficient way to do this is to actually build the power generation plant right next to the data center, so you transmit over the shortest distance. And we actually have very senior people, who actually come from the utilities industry, as strange as that sounds, who are expert in doing this and helping us build these gigantic projects. Again, I’ll harken back to Elon Musk. One of the hardest jobs he had in building Tesla was when he built the Austin plant; he had to build the largest building ever built by humankind anywhere at any time.
And you want to know the largest building ever built? It certainly is not the Pentagon. It’s not the NASA building for the space shuttle. The largest building is the Tesla plant. And so you have to be the contractor of that plant. You have to be able to build those things and then fill it with robots that then build your cars. So you’ve got to build the building, get the power, and build all the automation systems, which is the hardest part about building the cloud, so it works efficiently and reliably and cost-effectively. And we have some really interesting people here with a very different experience base than we had even five years ago.
Derrick Wood: Thank you very much.
Operator: Your final question comes from the line of Brad Zelnick of Deutsche Bank. Your line is open.
Brad Zelnick: Great. Thanks so much for fitting me in. And I’ll just start by saying I can’t remember a Q1 ever being this exciting. Larry, we’ve talked about a lot of the reasons why you win in cloud infrastructure, especially by addressing areas of the market where your competitors can’t even reach. But in light of many high-profile cyber incidents lately, can you talk about how being more secure, not just in being so highly automated as you’ve already discussed, but how is being more secure helping to win some of these very large OCI deals especially in the U.S. and other governments around the world? Thank you.
Lawrence Ellison: Well, I couldn’t thank you more for the question because I have two things I’m talking about at Cloud World on Tuesday. One is multi-cloud and the other is security. And let me announce right here: we’re done with passwords. The idea is utterly ridiculous. They’re easily hacked. The more difficult they are to remember, the more likely you are to write them down, and the more likely they are to be stolen. Everything done to make passwords better has made them worse. It’s a terrible idea. So we’re getting rid of passwords entirely. This is the way logging on is going to work. I’m going to type in larry.ellison@oracle.com, the computer is going to look at me and say, okay, hi, Larry, we’re done. Why would I type anything in? Safra can recognize me.
My kids can recognize me. You’re telling me a computer can’t recognize me and log me in? This is ridiculous. So, no more passwords. Those have got to go. There are other things we can do to secure network communications, and we have this new technology that actually goes live in our cloud this week called Zipper, Zero Trust Packet Routing, that I’ll be talking about. It secures communications on the network side, while biometric authentication will guarantee that you are who you say you are when you log in. So fraudulent users will find it very difficult to infiltrate our systems with biometric authentication. And by the way, there are keys on the keyboard that will look at your fingerprint. It depends on what you want; face recognition plus fingerprints, it’s all voluntary.
You don’t have to do it if you don’t want to. But a whole bunch of people certainly prefer biometric authentication. That’s why Google Pay is popular and Apple Pay is popular. It’s for my convenience. I don’t want to remember passwords. Look at me, recognize me and log me in. Don’t ask me to type in some stupid 17-letter password that someone can steal. There is that. The Zero Trust Packet Routing actually authenticates you from the user all the way to the data. And we’ve greatly simplified network security by separating it from network configuration. That’s another thing I’m going to talk about. But by the way, let me go back to automation. Automation is more of a security issue than it is an efficiency issue. Automated cars, self-driving cars, will kill a lot fewer people every year than human drivers.
They don’t get drunk. They don’t go 135 miles an hour. Automation is a safety issue and a security issue. Almost all cyberattacks begin the same way, with human error. If there is no Oracle database administrator, as there isn’t with the autonomous database, the DBA cannot make a configuration mistake that exposes your data. You can’t have a human being involved if you want to make your data secure. Autonomy is a huge portion of it. And the fourth piece of security that I’ll talk about is code generation. When you generate a computer program rather than hand-write it in Java, hey, we’re the owners of Java, or I don’t know, maybe the Supreme Court said we’re not the owners of Java. I’m not sure. I’m not sure what it even means.
I’m not a lawyer. But if the computer generates the program rather than a human being writing it, the computer will not generate security vulnerabilities. It will not generate something called state, which is what keeps your application from failing over automatically in case your data center loses power or burns down or something like that. Code generation is a huge portion of next-generation security. Zero trust packet routing is another important one; together with biometric authentication and automation, those are the four key attributes to winning the cyber war. It’s going to be our defensive robots against their robots, and we have to have better technology on the defensive side than they currently have on the offensive side, because right now they are winning. I mean, every year there are more successful cyberattacks.
And you know what the FBI says when you’re attacked? Pay the ransom; there’s nothing we can do about it. Well, that’s not very encouraging. I have an idea. Let’s keep the ransom money, but implement next-generation technology that makes it much harder for cyber criminals. And by the way, as bad as it is with cyber criminals now, that’s nothing compared to what state actors have the ability to do. So we have to harden our computer systems. We have to make them more secure. The good news is there is a whole new generation of AI-based security systems, like biometric authentication, like zero trust packet routing, that we can use to stop these attacks, but we have to actually deploy the technology.
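As a rough illustration of the "separate security from network configuration" idea behind zero trust packet routing, the sketch below allows traffic based on security attributes attached to endpoints rather than on IP addresses or network topology. The attribute names and policy shape are hypothetical and are not Oracle's ZPR syntax or implementation.

```python
# Hypothetical sketch of attribute-based traffic policy: intent is expressed in
# terms of security attributes, with no reference to subnets, IPs, or routes.

from dataclasses import dataclass, field

@dataclass(frozen=True)
class Endpoint:
    name: str
    attributes: frozenset  # security attributes, independent of network location

@dataclass
class Policy:
    # Each rule says: a source with these attributes may reach a destination
    # with those attributes.
    rules: list = field(default_factory=list)

    def allows(self, src: Endpoint, dst: Endpoint) -> bool:
        return any(need_src <= src.attributes and need_dst <= dst.attributes
                   for need_src, need_dst in self.rules)

# Hypothetical attributes for illustration only.
app_server = Endpoint("app-vm-17", frozenset({"app:finance", "env:prod"}))
database   = Endpoint("findb-01",  frozenset({"data:finance", "env:prod"}))
laptop     = Endpoint("dev-laptop", frozenset({"user:developer"}))

policy = Policy(rules=[
    (frozenset({"app:finance", "env:prod"}), frozenset({"data:finance"})),
])

print(policy.allows(app_server, database))  # True: attributes satisfy the intent
print(policy.allows(laptop, database))      # False: blocked regardless of network path
```

The point of this style of design is that moving a workload to a different subnet or data center does not change what it is allowed to reach; only its security attributes do.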
Brad Zelnick: Amazing, Larry. Look forward to learning more this week in Vegas.
Ken Bond: Thank you, Brad. Thank you, Larry. A telephonic replay of this conference call will be available for 24 hours on our Investor Relations website. Thank you for joining us today. And with that, I’ll turn the call back to Sarah for closing.
Operator: Thank you. This concludes today’s conference call. We thank you for joining. You may now disconnect your lines.