Mark Shmulik: Yes, thanks for taking the question. So it looks like time spent on content globally was off, I think led by
Operator: My apologies, Mark has dropped. The next question comes from Ross Sandler with Barclays. Please proceed.
Ross Sandler: Hey. I’ve got a high-level technology question. You guys have been stating forever that you’re a camera company. And we’ve seen an explosion in these new generative AI tools. How do you see those impacting your business? We have examples of Midjourney inside of Discord driving up engagement for them. Do you see the same kind of opportunity inside of Snap? Or do you view this as possibly a risk if people are going to the camera less? How do you see this impacting Snap over the next like, five years? Thanks a lot.
Evan Spiegel: Thanks, Ross. We’re so excited about the opportunity around generative AI; it’s a huge opportunity for us, and we’re already investing a ton. A lot of our most sophisticated AR lenses use generative AI technology. And we also see a lot of opportunities just to make our camera more powerful with generative AI. Some simple examples are improving the resolution and clarity of a Snap after you capture it, or even much more extreme transformations, like editing images or Snaps based on a text input. But if we think longer term, five years, as you mentioned, this is going to be critical to the growth of augmented reality. Today, if you look at AR, there’s a real limitation on what you can build in AR, because there’s a limited number of 3D models that have been created by artists.
And we can use generative AI to help build more of these 3D models very quickly, which can really unlock the full potential of AR and help people make their imagination real in the world. You can imagine playing around with your kids wearing AR glasses and pointing: oh my gosh, there’s a pirate ship and a big monster. And we can bring those to life using generative art, which I think is really exciting. And then, of course, we’re also thinking about how to integrate those tools into Lens Studio. We saw a lot of success integrating Snap ML tools into Lens Studio, and it’s really enabled creators to build some incredible things. We now have 300,000 creators who have built more than 3 million lenses in Lens Studio. So the democratization of these tools, I think, will also be very powerful.
Operator: Thank you. The next question is from Richard Greenfield with LightShed Partners.
Richard Greenfield: Hi. Thanks for taking the questions. One, I guess I want to just follow up on the initial topic of this: DR was up 4% in Q4, which is actually a pretty encouraging number. But it sounds like, based on your guidance, the changes you’re making in Q1 are going to drive that DR to go from up 4% to down something. Can you just help us better understand, as the DR flywheel and your investments start to kick in, why is it not driving sort of accelerated spend? Why is it actually hurting? I know you’ve touched on it a little bit in the letter, but I think there’s a lot of confusion about why that inflection to the negative in DR as you’re sort of investing to improve it. And then, just on engagement, you made a comment that global time spent was up on content.
Was that true in the U.S.? Or was that more of a global comment? And then on friends’ stories, you said it was down in Q4? Is that TikTok, Reels, Shorts? Just any color on what’s driving sort of the pressure on friends’ stories in Q4 would be great.