Operator: The next question comes from Erik Rasmussen with Stifel. Please go ahead.
Erik Rasmussen: Thanks for taking the question. Maybe just on AI. It seems most of the demand in the early stages is expected to come from training versus inferencing. But we heard from Microsoft on their earnings call, and they said that most of the AI strength that they were seeing for Azure was, in fact, from inferencing workloads. Are you seeing the same as it relates to sort of the demand patterns from your customers? And then maybe any way to quantify how much AI has contributed or the size of the opportunity. Thanks.
Andrew Power: Thanks, Erik. Let me just touch on this a little bit and then hand it to Chris to walk you through chapter and verse. If you look at where our heritage is as a company, this AI opportunity is squarely in our wheelhouse. We came at this from the hyperscale piece of the business and built the colo and connectivity capabilities organically and inorganically. We are often taking larger halls that are at higher power densities and using our engineering prowess to work for enterprise customers and push their boundaries on power densities. I can tell you, AI workloads were at Digital before I joined over nine years ago. We've done retrofits on existing deployments just this year to fit them up for customers needing AI. At the end of last year, I was at one of our Paris facilities.
There, one of our partner customers has fitted out liquid cooling for a multinational financial services company in live environments, which meant that they had to get going on that years ago to be live at the end of last year. So this is right in our wheelhouse, and we've been winning in that category over the last year, from the, call it, several-hundred-kilowatt domain to the 30-plus-megawatt piece of this, and you've seen some testimonials on that. Chris, why don't you give a little bit of color on some of the verticals we've been winning in and where you see this going as well?
Christopher Sharp: Yes. It's a great question, and I appreciate it, Erik. I think there are a couple of dynamics that you touched on. The training-to-inference shift is something that we've been watching for some time. We're selective with some of the training environments, just because we're looking for a long-term, durable workload being deployed into the asset. A lot of the customers we see today are actually doing training, or inference inside of their training, just because of the sheer availability of the GPUs and time to market. But we definitely see the long tail of that value happening in inference, and then also another segment of private AI. So we're seeing customers come to market with these types of requirements. As Andy alluded to at a high level, coming from our heritage of scale and then being able to evolve and support a colo type of capability, if you will, allows us to support a higher power density need with our versatile designs.
And so being highly focused on that with a lot of the hyperscale customers, and being foundational for their cloud services, has been top of mind. Another element that may be overlooked is that a lot of this inference is embedded in a lot of our top customers' offerings today as capabilities. So when you read about, and I think you referenced, Microsoft and the work that they've been doing with Copilot, a lot of that AI is embedded in enhancing their current product capabilities. We're constantly watching how that evolves, but being proximate to the existing infrastructure and workloads is absolutely something that we're very excited about, because that inference benefits from the data oceans that exist within Digital Realty, customers' current infrastructure, and how all that culminates together.
So it's something you'll see play out over this year. I think we just had a really good case study with KakaoBank that highlights a private AI deployment, where they're able to do a bit of R&D around new product offerings for the financial vertical. So these are the things you're going to see play out in 2024 that we're very excited about.
Operator: The next question comes from Nick Del Deo with MoffettNathanson. Please go ahead.
Nick Del Deo: Hey, thanks for squeezing me in. I was wondering if you could expand a little bit on the assumptions you're making about the broader leasing environment and the pricing environment to get to the 4% to 6% range for cash renewal spreads. Are you just basically taking current prices as a given? Or are you assuming further increases or other changes? And can you share anything about how the expected renewal spreads look for the zero-to-one versus greater-than-one categories? Thanks.
Andrew Power: Sure, Matt. Why don’t you take that?