And some of them are also taking that next step and actually doing distribution using our cloud storage offering. So we see it as an evolution for them: the first, most basic level is backups and archives. Then it’s a more active archive use case, where they archive the data but then pull from it. Then it’s actually developing the media production, leveraging our workflows, and then finally doing distribution. In terms of where we’re finding them, a lot of them continue to come to us through the same methods that attract all of our customers, which is a combination of content and community. The blogs that we publish have millions of readers, and a lot of those readers end up becoming media and entertainment customers.
But we’ve also layered on marketing motions, including events. We present at media events such as NAB in Las Vegas, and we’ve been at other media events in New York, Amsterdam, and other places. So that layers on top of that. And then the channel effort that we’ve been investing in is also a good source for us. Channel partners have relationships with media companies; they find out that those companies need help, that they want to transition off of LTO tape, off of on-premise systems, or sometimes off of the traditional legacy clouds. They bring those deals to us, and then we work with them to support those customers.
Jason Ader: And then just a second quick question for you, Gleb. When you talk about Backblaze B2 being used for AI-related use cases, I always thought that AI required flash storage because of the need for speed, especially on the inferencing side. So can you just talk to whether that’s a misperception on my part?
Gleb Budman: Obviously, AI has a lot of different aspects to it, and we’ve published, including on our blog, some of the workflows that companies use with AI. There’s the development of the training models, which starts with a lot of data. The workflow there is to run many iterations at high speed on that data to build the model. That generally requires very high performance storage located close to the compute. That’s not the optimal use case for us, at least to date. But a lot of the other workflows are actually really good fits. The workflow we’ve seen customers follow is that they will often upload information: sometimes it’s from cameras, sometimes it’s from existing assets they have, sometimes it’s from systems that are generating data.
That data flows into Backblaze and is stored in B2. Then they use our combination of free egress and partnerships with GPU clouds to send the data to those locations for processing. The processed data then gets put back into Backblaze B2, one, to be served up as part of the application itself to customers, and two, for longer-term retention, backups, and archives of that AI data. We also have some customers that use us as the original place where they store the data before it gets used for model training. So there are some use cases for which object storage in the cloud is a great fit and some for which it’s not, but there is a tremendous amount of data being generated out there for AI. We think AI is still definitely in the early innings of the opportunities that we all have to help there.
Operator: The next question comes from Simon Leopold with Raymond James.
Unidentified Analyst: This is Victor Chuan [Ph] for Simon Leopold. I just want to follow up on that AI question. Did you say that you have some AI-specific functions and initiatives in the works, or were you just describing the current use cases for AI? I’d just like you to elaborate on your exposure to AI and what the focus features of your AI solutions are.
Gleb Budman: This is Gleb. The short answer is yes to both. If you look back at some of what we did in the last year, the first part is making sure that we have a durable, high-performance, available storage platform that’s affordable, which is a key component of providing value to customers who want to serve their AI data needs. Fundamentally, making sure we’re providing them a top-tier storage platform for all of that AI data, which needs to be stored somewhere, delivered to other locations, and delivered to customers, is key. We’ve been investing behind that platform, and in Q4 we launched Shard Stash, a higher-performance way for B2 to work, which is a helpful piece of continuing to add value to that platform.
So that’s step one. Step two is making sure that we’re supporting our customers in understanding how to run these different workflows and how to use object cloud storage as part of that. As I mentioned, on our blog there are stories and case studies around the different workflows and how to use them, and we’re working to understand that landscape better. The third thing is partnerships. We mentioned that we partnered with CoreWeave and with Vultr. These are GPU clouds we’ve partnered with, and we make it easy for customers to move their data between us and them as part of this open cloud ecosystem supporting their AI usage. And the last part is that, as I mentioned, once we hire a new head of sales, one of his key areas of focus is going to be our AI initiatives, which includes both the go-to-market side, some of which I talked about, and the product platform side.