Thoughtworks Holding, Inc. (NASDAQ:TWKS) Q1 2024 Earnings Call Transcript


And then at the same time, as I mentioned earlier, we continue to invest in this vertical-based approach, and we're focused on the verticals that we believe are more resilient in the current environment, for example the public sector, energy, healthcare, automotive and life sciences. We're seeing that pay off as well: a large number of our new logos come from these verticals. And finally, we are expanding our service offerings, especially efficiency and cost-saving programs that cater to the short term, but also the more strategic work some of our clients are asking for. I mentioned DAMO, our digital application management and operations service. Now approximately 30% of our top 50 clients are using this service; it was obviously zero a few years ago.

We are continuing to expand this. Again, this gives us the opportunity to increase our total addressable market, even with the same client portfolio we have.

Arvind Ramani: That’s really helpful, and that explains it. On one of the earlier questions, I think what you were saying is that a lot of the work around GenAI is still proof-of-concept and pilot, so fairly small in scale. The one thing I want to clarify: is there even a single large-scale GenAI project you’ve seen, at Thoughtworks or even at your clients, where maybe Thoughtworks is not involved? Have you seen any large-scale GenAI project? And if you haven’t, when it does come together, what kind of project would it be? Even looking at the proofs-of-concept you’ve seen, where do you see the promise of a much, much larger-scale GenAI project?

Guo Xiao: It’s a great question. We do have a few fairly large GenAI-related projects. To peel back the layers, what tends to happen is that some of the POCs simply use open, cloud-based platforms with an existing large language model to perform simple tasks like content generation and some basic analysis. Then you move up a layer to the RAG-based approach, which is very popular right now; that takes a bit more work before you feed any prompts into the large language model. So the work is around the large language model, but the model itself is still one of the commercial ones that’s readily available.
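A minimal sketch of the RAG pattern described here, assuming an OpenAI-compatible client and a toy in-memory document store; the helper names and placeholder documents are hypothetical, not anything Thoughtworks has disclosed:

```python
# Minimal retrieval-augmented generation (RAG) sketch.
# Assumptions: an OpenAI-compatible client and a toy in-memory document
# store; both are illustrative, not a specific product or engagement.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

documents = [
    "Placeholder domain document one: internal policies and procedures ...",
    "Placeholder domain document two: product terminology and examples ...",
]

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Toy keyword-overlap retrieval standing in for a real vector search."""
    overlap = lambda d: len(set(query.lower().split()) & set(d.lower().split()))
    return sorted(docs, key=overlap, reverse=True)[:k]

def answer(query: str) -> str:
    context = "\n".join(retrieve(query, documents))
    # The extra work happens here, before the call to an off-the-shelf
    # commercial LLM: retrieve relevant text and build a grounded prompt.
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "Answer using only the provided context."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {query}"},
        ],
    )
    return response.choices[0].message.content

print(answer("What do the internal policies cover?"))
```

The model call itself is unchanged; the additional effort referred to above sits in the retrieval and prompt-assembly step that runs before it.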

And then one level deeper is fine-tuning. We have certain projects doing fine-tuning of a large language model, and that requires more effort, more data and more computing power, so it’s a little more heavy lifting from a cost perspective. The largest programs we’ve seen so far are in the final phase, which is developing your own large language model. Those are definitely fewer than the others, but we already have them in flight. And we do believe that as the POCs and the adoption of GenAI progress, we’re going to see more and more of this work moving, as I mentioned earlier, from left to right, slowly proceeding to finally building your own large language model. Now, you don’t need to build your own LLM in every case, but some will require that, and that’s a significant effort.

And then those will generate much bigger projects.
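A minimal sketch of what the fine-tuning layer involves, using the Hugging Face transformers library; the base model, the placeholder documents and the hyperparameters are chosen purely for illustration, not details from the call:

```python
# Minimal causal-LM fine-tuning sketch with Hugging Face transformers.
# The base model, the two placeholder documents and the hyperparameters are
# illustrative only; real programs use far larger models and datasets.
from datasets import Dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

base_model = "gpt2"  # small open model standing in for a much larger one
tokenizer = AutoTokenizer.from_pretrained(base_model)
tokenizer.pad_token = tokenizer.eos_token  # gpt2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained(base_model)

# Placeholder domain data; in practice assembling this is the costly part.
raw = Dataset.from_dict({"text": [
    "Domain document one: procedures, terminology, worked examples ...",
    "Domain document two: more proprietary text the model should absorb ...",
]})
tokenized = raw.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True, remove_columns=raw.column_names,
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="ft-out", num_train_epochs=1,
                           per_device_train_batch_size=2),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()              # the heavy lifting: data plus compute
trainer.save_model("ft-out")
```

Building your own model sits one step beyond this: the same training loop in spirit, but with pre-training-scale data and compute rather than a fine-tuning pass, which is why those programs are the largest.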

Arvind Ramani: Perfect. That’s really helpful. Thank you very much. And once again, it was really a pleasure working with you.

Guo Xiao: Thank you, Arvind. Same here.

Erin Cummins: I’m just going to jump in quickly. While we’re still on the call, I wanted to quickly follow up on the share count dynamics that Jacob asked about earlier. I would simply highlight that the share count dynamics reflect the share price and how it impacts the accounting treatment with respect to dilution. So that’s a big part of the change from last quarter to this quarter. An example would be dilution from options. And with that, I’ll hand over to you, Operator.
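For context on the dilution mechanics mentioned here, a short illustration of how a moving share price changes the diluted share count; the treasury stock method below is a standard accounting treatment that the call does not explicitly name, and the numbers are hypothetical:

```python
# Treasury stock method illustration (an assumption; the call does not name
# the method): dilution from options shrinks as the average share price falls.
def incremental_shares(options: float, strike: float, avg_price: float) -> float:
    """Options add dilutive shares only when they are in the money."""
    if avg_price <= strike:
        return 0.0
    return options * (1 - strike / avg_price)

# Hypothetical numbers, purely illustrative.
print(incremental_shares(1_000_000, strike=4.0, avg_price=8.0))  # 500,000 shares
print(incremental_shares(1_000_000, strike=4.0, avg_price=5.0))  # 200,000 shares
print(incremental_shares(1_000_000, strike=4.0, avg_price=3.0))  # 0 -- anti-dilutive
```

The point of the example is simply that dilution from options moves with the average share price, which is the dynamic being referenced.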

Operator: Thank you. There are no further questions. I’d like to turn the call over to Xiao for closing remarks.

Guo Xiao: Thank you. And thank you everyone for joining us today for our Q1 earnings call. I want to thank all Thoughtworkers, clients and partners for the extraordinary impact we’re delivering every day together. Stay well and all the best.

Operator: Thank you for your participation. This does conclude the program, and you may now disconnect. Everyone, have a great day.

