Innodata Inc. (NASDAQ:INOD) Q4 2023 Earnings Call Transcript February 22, 2024
Operator: Greetings. Welcome to Innodata’s Fourth Quarter and Fiscal Year 2023 Earnings Call. At this time, all participants are in a listen-only mode. A question-and-answer session will follow the formal presentation. [Operator Instructions] Please note, this conference is being recorded. I will now turn the conference over to your host, Amy Agress. You may begin.
Amy Agress: Thank you, John. Good afternoon, everyone. Thank you for joining us today. Our speakers today are Jack Abuhoff, CEO of Innodata; and Marissa Espineli, Interim CFO. We’ll hear from Jack first, who will provide perspective about the business, and then Marissa will follow with a review of our results for the fourth quarter and the 12 months ended December 31, 2023. We’ll then take your questions. Before we get started, I’d like to remind everyone that during this call, we will be making forward-looking statements, which are predictions, projections or other statements about future events. These statements are based on current expectations, assumptions and estimates, and are subject to risks and uncertainties. Actual results could differ materially from those contemplated by these forward-looking statements.
Factors that could cause these results to differ materially are set forth in today’s earnings press release in the Risk Factor section of our Form 10-K, Form 10-Q and other reports and filings with the Securities and Exchange Commission. We undertake no obligation to update forward-looking information. In addition, during this call, we may discuss certain non-GAAP financial measures. In our SEC filings, which are posted on our website, you will find additional disclosures regarding these non-GAAP financial measures, including reconciliations of these measures with comparable GAAP measures. Thank you. I’ll now turn the call over to Jack.
Jack Abuhoff: Good afternoon, everybody. We’re very excited to be here with you today, as we have a lot of good news to share. We are pleased to announce fourth quarter 2023 revenues of $26.1 million, representing 35% year-over-year growth and 18% sequential growth. We exceeded our guidance of $24.5 million by 6.5% as a result of strong customer demand for generative AI services and our ability to ramp up quickly to meet customer demand. In 2023 overall, we grew revenues 10%. Now, it’s worth noting that if we back out revenue from the large social media company that went through a highly publicized take-private in 2022, in conjunction with which it terminated our services, as well as services from many of its other vendors, and laid off 80% of its staff, our Q4 2023 year-over-year revenue growth was 39% versus 35%, and our full-year revenue growth was 23% versus 10%.
This customer contributed $8.5 million in revenue in 2022 and $0.5 million in revenue in Q4 of 2022. Beginning in Q1 2024, revenue from this customer will no longer be a drag on year-over-year comparisons. We are also very pleased to announce fourth quarter adjusted EBITDA of $4.3 million, exceeding our guidance of $3.7 million by 16%. Growth in Q4 was driven primarily by the ramp of generative AI development work for one of the Big Five tech companies we signed mid-2023, and also benefited from the start of the generative AI development program with another of the Big Tech customers we announced late last summer. In late Q4, the first customer I mentioned signed a three-year deal with us for our current, initial program, with an approximate value of $23 million per year for each of 2024, 2025 and 2026, or $69 million for the three years, based on the not-to-exceed value of the statement of work.
We’re very proud of this achievement. It came with customer kudos for the work we’ve done and expressions of interest in expanding the partnership further. That said, and as a cautionary note, investors should understand that there are a number of ways under the SOW that the customer could terminate early or reduce spend if it chose to. We believe the quality of our services will always be the key to enduring customer relationships, not the stated value or term of a contract. We’re off to a strong start to 2024. We entered the year with master service agreements in place with five of the so-called Magnificent Seven technology companies. With two of these companies, we are now solidly underway. A third also contributed to Q4 growth, with a more significant ramp-up from this customer starting this month.
We are optimistic we will grow revenues with all three of these customers in 2024. With the remaining two of the five Mag Seven customers, we’ve barely gotten out of the gate, but we are optimistic about making significant inroads this year. We are also in conversations with several additional companies, including some of the most prominent leaders in generative AI today. We believe we have the strategy, business momentum and customer relationships to deliver significant revenue growth in 2024. We will stick to our annual growth target of 20% in 2024 with the intention of over-achieving this. In 2024, we will target two broad markets. The first is Big Tech companies that are building generative AI foundation models and we believe are likely to spend significantly on generative AI development.
For these Big Tech companies, we provide a range of services they require to support their gen AI programs. One of these services is the creation of instruction datasets. You can think of instruction datasets as the programming used to fine-tune large language models. Fine-tuning with instruction datasets is what enables the models to understand prompts, to accept instruction, to converse, to apparently reason and to perform the myriad of incredible feats that many of us have now experienced. We will also be providing reinforcement learning and reward modeling services, which are critical to providing the guardrails against toxic, biased and harmful responses. In addition, we are also involved in model assessment and benchmarking, helping ensure that models meet performance, risk and emerging regulatory requirements.
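To make the instruction-dataset idea concrete, here is a minimal, hypothetical sketch in Python. The record fields and examples are invented for illustration, loosely following common open-source instruction-tuning layouts; they are not Innodata's actual data format:

```python
# A minimal, illustrative instruction-tuning dataset. Each record
# pairs an instruction (and optional input) with a carefully written
# target response; a model is fine-tuned to produce the response
# when given the instruction.
instruction_dataset = [
    {
        "instruction": "Summarize the text in one sentence.",
        "input": "Fourth quarter revenue was $26.1 million, up 35% year over year.",
        "output": "Q4 revenue grew 35% year over year to $26.1 million.",
    },
    {
        "instruction": "Classify the sentiment of the review.",
        "input": "The product exceeded my expectations.",
        "output": "positive",
    },
]

def to_training_text(record: dict) -> str:
    """Flatten one record into the single prompt/response string a
    fine-tuning pipeline would tokenize."""
    return (
        f"### Instruction:\n{record['instruction']}\n"
        f"### Input:\n{record['input']}\n"
        f"### Response:\n{record['output']}"
    )

for record in instruction_dataset:
    print(to_training_text(record))
```

The quality bar discussed on the call lives in the "output" field: the value of a dataset like this comes from how carefully those target responses are written and reviewed.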
Based on my conversations with several of these companies, as well as public remarks they have made, we believe they are likely to spend hundreds of millions of dollars each year on these services. This spend is separate from and in addition to their spend on data science and compute, the other essential ingredients of high-performing large language models. Our second target market is enterprises across a wide range of verticals that seek to integrate and fine-tune generative AI models. These are still early days in terms of enterprise adoption of generative AI, but we believe that a decade from now virtually all successful businesses will have adopted generative AI technologies into their products and operations. For enterprises, our offerings include business process management, in which we re-engineer workflows with AI and LLMs and perform the work as ongoing managed services.
We also offer strategic technology consulting, where we work with customers to define roadmaps for AI and LLM integration into both operations and products and build prototypes and proofs-of-concept. We also fine-tune models, both in isolation and as part of larger systems that incorporate other technologies. For enterprises, we are capable of going soup-to-nuts, everything from initial consulting to model selection to fine-tuning, deployment and integration, as well as testing and evaluations to ensure that the LLMs are helpful, honest and harmless. Also for enterprises, we offer subscription-based platforms and industry solutions that encapsulate AI, both our own models and leading third-party models. Much the way data is at the heart of the programming-like work we do for Big Tech, data is similarly critical to enterprise deployments.
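As a rough illustration of how curated company data can steer a general-purpose model, here is a hypothetical in-context learning sketch in Python. All names, questions and answers are invented; this is not Innodata's actual tooling:

```python
# Illustrative sketch of in-context learning: curated, domain-specific
# examples are prepended to the prompt so a general-purpose LLM
# answers in the desired style, without any fine-tuning.
curated_examples = [
    ("What is the notional value of contract A-17?", "USD 2.4 million"),
    ("Which party bears currency risk under clause 9?", "The counterparty"),
]

def build_prompt(question: str, examples: list) -> str:
    """Assemble a few-shot prompt from curated Q/A pairs."""
    parts = ["Answer in the style of the examples below.\n"]
    for q, a in examples:
        parts.append(f"Q: {q}\nA: {a}\n")
    parts.append(f"Q: {question}\nA:")
    return "\n".join(parts)

prompt = build_prompt("What is the governing law of contract A-17?",
                      curated_examples)
print(prompt)  # this string would be sent to whichever model is chosen
```

The point of the sketch is that the curation of the example pairs, not the model itself, is what carries the domain knowledge.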
Enterprise use cases tend to be highly specific and targeted, requiring models that are trained with industry-specific or domain-specific data, or that require significant prompt engineering efforts and in-context learning utilizing carefully curated and organized company data. The bottom line here is that data engineering is important both for the Big Tech companies building generative AI foundation models and for the enterprises adopting these technologies. Data engineering has been our focus for the past two decades and we believe we are quite good at it. I am going to take a few minutes now to respond to some questions I’ve been asked by investors recently. Number one, several investors have asked whether we currently anticipate needing to raise additional equity.
The answer is no, we do not currently anticipate needing to raise additional equity. We ended Q4 with $13.8 million in cash and short-term investments, slightly down from $14.8 million last quarter, but that was largely due to timing, as we had $2.4 million in cash receipts from major customers collected right after the New Year, and we generated over $4 million of adjusted EBITDA in Q4 alone. Nonetheless, to support our growth and future working capital requirements, we have a revolving line of credit with Wells Fargo that provides up to $10 million of financing, 100% of which was available under our borrowing base as of the end of Q4. We have not yet drawn down on the Wells Fargo line. We anticipate generating enough cash from operations in 2024 to fund our capital needs without having to draw down on the Wells Fargo facility.
Number two, several investors have asked why we have no Chief Technology Officer. Well, in a sense, we actually have four Chief Technology Officers, or at least their equivalents, each of whom manages a specific technology area: we have a PhD in computer science and AI who heads our AI labs research team and data science teams; we have an SVP of engineering overseeing product and platform engineering; we have another VP focused on software development and product evolution for our Agility product; and we have a Chief Information Security Officer who heads security and infrastructure. Under these leaders, we have close to 300 developers, architects, infrastructure managers and data scientists. We have found that this structure best supports the breadth and scale of our business.
Investors have asked us to share our recent spending on software and product development, have asked why we do not separately disclose it, and have asked us to comment on whether we have a significant spend on cloud infrastructure. So, there are three separate questions there, and I’ll address each. In terms of our spending across software and product development, over the last five years, we spent about $26 million. This peaked in 2022 at $8.9 million and came down to $6.4 million in 2023. However, since roughly 80% of our business is managed services, we do not view the aggregate spending across these areas as a focal point for investors. In terms of cloud, we spend a couple of million dollars per year, mostly for software, infrastructure and data hosting.
It is our Big Tech customers, not us, that spend massively on GPUs for training foundation models. Other investors have asked us how they should think about our comps. Specifically, they asked whether our comps are companies like OpenAI, Google and Meta, and whether they should compare our R&D spend and cloud compute spend to these companies. These companies are absolutely not our comps. Rather many of these companies constitute part of our target market. We are not in their business, and to state the obvious, we are not of similar scale. Players in this market are building foundation models and we are providing services to this market that help them on their journey. Therefore, we do not believe that comparing our R&D spend and cloud compute spend to theirs is especially useful.
We view our competition as companies focused on AI data engineering services to this market, like ScaleAI and others, and companies more broadly focused on technology services but also focused on AI data engineering, like Accenture and Cognizant. Another question I’ve gotten is how we managed to pivot to AI without having to raise substantial capital. There are essentially three reasons we were able to pivot to AI without having to raise capital. The first reason, which we believe is by far the most important, is that the massive spend we read about being required to build foundation models is incurred by our large tech customers, not by us. Our customers are deploying extensive amounts of capital for cloud compute, for data science and for data engineering, three crucial ingredients to an LLM, if you will.
We provide the kinds of data engineering services they need, and providing data engineering does not require that we separately incur compute costs. The second reason we were able to transition to AI data engineering without incurring massive upfront costs is that we have been a data engineering company for over 20 years, and we were able to repurpose a lot of what we already had in place, including management, resources, facilities and technologies, to serve the AI use cases. The third reason is that when we began exploring AI back in 2016 and developing our Goldengate infrastructure, the investment we incurred was manageable. From a data perspective, because we were already employing large teams of resources doing customer work, we did not have to incur incremental costs for humans-in-the-loop.
We simply had to re-architect our operator workbenches and create the right data lakes. The objectives we initially set for the models we built were to enable us to reduce costs associated with maintaining rules-based data processing technologies. We were not seeking to automate the work of humans, but to augment it. Over the years, Goldengate, one of our proprietary platforms, became, we believe, state-of-the-art at things like entity extraction, data categorization and document zoning, all important aspects of what we do. We use the technology in customer deployments and within our own platforms, and it yields great results. That said, Goldengate is not ChatGPT: you can’t converse with it or ask it to write poetry. Goldengate has 50 million parameters, while ChatGPT is reputed to have 1.7 trillion parameters.
Nevertheless, Goldengate demonstrates that AI can be trained to perform specific tasks very well without incurring massive spending; that AI deployments leveraging open-source algorithms and models can be within reach for many enterprises with industry-specific datasets; and that for business implementations especially, data engineering is more important than sheer model size as a predictor of performance. A question I got recently is how revenue per employee compares in our different lines of business. The answer is that revenue per employee is lowest in our managed services business, while it is multiple times higher in our AI data engineering scaled services. Regardless, we target an adjusted gross margin of 35% to 37% across these two business lines, and we believe gross margin is the better metric to track.
In our software business, our targeted gross margin is anticipated to be about 73% this year, and we intend to target a consolidated adjusted gross margin of between 40% and 43%. The final question I’ve gotten several times recently, and that I want to respond to on today’s call, is: is Agility now profitable? The answer is yes. In this quarter, Agility posted adjusted EBITDA of $1.2 million. This was a 69% sequential increase over Q3. We think we executed the Agility business very well in 2023, growing it 15% in a difficult macro environment. It had a strong adjusted gross margin of 69% over 2023 as a whole and 74% in Q4. We also love what we’ve done with the product; we believe we’ve taken a leadership position as the first end-to-end public relations and media intelligence platform to integrate generative AI.
I’ll now turn the call over to Marissa to go through the numbers and then we’ll open the line for some questions.
Marissa Espineli: Thank you, Jack. Good afternoon, everyone. Allow me to recap our fourth quarter and fiscal year 2023 results. Revenue for the quarter ended December 31, 2023 was $26.1 million, up 35% from revenue of $19.4 million in the same period last year. The comparative period included $0.5 million in revenue from the large social media company that underwent a significant management change in the second half of last year, as a result of which it dramatically pulled back spending across the Board. There was no revenue from this company in the three months ended December 31, 2023. Net income for the quarter ended December 31, 2023 was $1.7 million or $0.06 per basic share and $0.05 per diluted share, compared to a net loss of $2 million or $0.07 per basic and diluted share, in the same period last year.
Total revenue for the year ended December 31, 2023 was $86.8 million, up 10% from revenue of $79 million in 2022. The comparative period included $8.5 million in revenue from the large social media company referenced above. There was no revenue from this company in 2023. Net loss for the year ended December 31, 2023 was $0.9 million or $0.03 per basic and diluted share, compared to net loss of $12 million or $0.44 per basic and diluted share in 2022. Adjusted EBITDA was $4.3 million in the fourth quarter of 2023, compared to adjusted EBITDA of $0.2 million in the same period last year. Adjusted EBITDA was $9.9 million for the year ended December 31, 2023, compared to adjusted EBITDA loss of $3.3 million in 2022. Cash, cash equivalents and short-term investments were $13.8 million at December 31, 2023 and $10.3 million at December 31, 2022.
Now, before I turn to your questions, like Jack, I have also gotten some questions from investors recently that I promised to respond to on today’s call. The first question was about why we keep cash overseas. The reason we keep cash overseas is to cover operating expenses in these locations. We do not plan to repatriate these funds, nor do we foresee the need to. Another question was about the cost-plus transfer pricing agreements with our offshore subsidiaries. Companies that have revenue in, say, North America or Europe, but have offshore delivery centers in countries like India and the Philippines, put in place what are called transfer pricing arrangements to satisfy the arm’s-length transaction principle. Under a transfer pricing arrangement, a percentage of revenue is allocated to the delivery center.
The percentage allocated is often determined by statute or regulation in the foreign country. We understand that the reason the foreign country does this is to make sure there are profits at the local level for it to tax. However, when the consolidated enterprise is losing money and would not otherwise have to pay taxes, it unfortunately ends up having to pay taxes offshore. Obviously, paying taxes when you are losing money is not a good thing and is referred to as tax leakage, but even in this situation, the tax we pay is insignificant versus the money we save by operating offshore. This business model is very common across many industries and not unique to Innodata. The last question that I’ve gotten is whether there is any structural reason that Innodata would be expected to lose more money as it generates more revenue.
The answer to this is absolutely not. As Innodata’s revenue increases, we expect that its adjusted EBITDA will increase at an even higher percentage. This is because there is some operating leverage in our direct costs, for things like production facilities and other fixed expenses, and significant operating leverage in our general and administrative operating costs. We saw clear evidence of this in both Q3 and Q4. In Q3, revenue grew sequentially by $2.5 million and adjusted EBITDA grew sequentially by $1.6 million. Similarly, in Q4, revenue grew sequentially by $3.9 million and adjusted EBITDA grew sequentially by $1.1 million. There will, however, be quarterly fluctuations in how much revenue falls to the EBITDA line, based on how we flex our operating expenses, particularly our sales and marketing efforts, in response to market dynamics.
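The operating-leverage point can be checked directly from the sequential figures cited above: the share of each incremental revenue dollar that fell to adjusted EBITDA. A quick Python sketch, using the figures from the call in millions of dollars:

```python
# Incremental adjusted EBITDA margin = sequential adjusted EBITDA
# growth divided by sequential revenue growth, using the figures
# cited on the call ($ millions).
quarters = {
    "Q3 2023": {"revenue_growth": 2.5, "ebitda_growth": 1.6},
    "Q4 2023": {"revenue_growth": 3.9, "ebitda_growth": 1.1},
}

for quarter, d in quarters.items():
    incremental_margin = d["ebitda_growth"] / d["revenue_growth"]
    print(f"{quarter}: {incremental_margin:.0%} of incremental revenue "
          f"fell to adjusted EBITDA")
```

That works out to roughly 64% in Q3 and 28% in Q4, both well above the company's consolidated EBITDA margin, which is the pattern operating leverage produces.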
Well, I hope I was able to address some of our investor queries. Again, thanks everyone. And I will now turn this over to John. John, we are now ready for questions.
Q&A Session
Operator: Thank you. [Operator Instructions] The first question comes from Tim Clarkson with Van Clemens. Please proceed.
Tim Clarkson: Hey, Jack. How are you doing?
Jack Abuhoff: Hey, Tim. Doing great.
Tim Clarkson: Good. Good. Well, I thought the quarter was outstanding. So just as a question, I’m going to have you answer it, but you’re going to answer it in a more sophisticated way than I’m going to say it. But I mean, when I originally learned about Innodata being involved in AI, Rahul told me, and this is what — he told me when the stock was at $1, he said, listen, the reason Innodata is going to be successful is they’re the most accurate. And at IBM, the reason we had so much trouble on 80% of our deals was inaccuracy. And so far, you’ve gotten a number of smaller contracts and now you’ve gotten the big contracts. It’s coming true. So to me, that’s maybe a real simple insight for some people who are intimidated by all the complexity of AI. But why don’t you explain in the simplest terms, how Innodata fits into AI?
Jack Abuhoff: Sure. Well, I don’t think your question is particularly unsophisticated; I think exactly what you said is correct. The key to programming large language models is essentially the data engineering that goes into it, and the principle of garbage in, garbage out holds very much true. What I see that we’re doing a great job at is creating very high-quality datasets that our customers are able to use and incorporate into large language models to get the performance from the models that they’re seeking. Instruction datasets are key to helping the models understand prompts, to accept instruction, to converse, to reason, all of these things. And that’s how they’re competing.
They’re competing on the quality of the experience that their customers will have with the models that they’re building. So, to the extent that the data engineering that we provide to them is helping them achieve that, that obviously is a very, very good thing. Now, on top of data accuracy and data engineering, the things we’ve been focused on for so long now, I think we create the customer experience that they’re looking for. They’re figuring things out. They need a company that’s highly dynamic and agile, that can stay with their engineering team, and that can be responsive to the changing requirements the engineering team has, and again, that’s something that’s firmly built into our culture.
So, we’re very proud of the results that we’re showing. We’re very proud of the quality of the partnerships that we’re achieving. I think, well, we announced that for one of the large deployments this quarter, we signed a three-year ongoing contract with a hopeful value of $69 million. It’s a huge achievement and what that came with was a lot of wonderful things that the customer had to say about us, about the value of the data, exactly like you just said, and about the quality of the experience that they have with us. So we think we’re doing good, we’re very well poised for an exciting year next year and we’re very excited about that.
Tim Clarkson: Right. Now, looking at your projections, I mean, you said last time you expect some $30 million quarters. It looks like based on what you did in the fourth quarter and in your growth rates, you’re approaching that sometime this year, right?
Jack Abuhoff: Well, I think, we’re going to stick with the guidance that we’re providing. Our intention is to surprise and delight our investors. We think we have the opportunity to do that.
Tim Clarkson: Right.
Jack Abuhoff: So the guidance that we’ve put out there is 20% growth, but with the intention of besting that…
Tim Clarkson: Sure.
Jack Abuhoff: I think we have a very good chance of being able to do that.
Tim Clarkson: Right. Right. Now when I look at the P&L, I know you like to look at EBITDA. I like to look at net after-tax. It seems to me that somewhere as you approach, say, $35 million, at $30 million, you start to net 10% to 15% after-tax and at $35 million, you start to approach more like 15% to 20% after-tax. Is that about right?
Jack Abuhoff: Yeah. We are not going to — there are a lot of things that go into the model. I think that we’re going to resist the temptation of kind of digging in and creating more of a model than we are. The guidance is what we’re saying. I think we intend to do better than that and perhaps…
Tim Clarkson: Right.
Jack Abuhoff: … significantly and I think the business is not that difficult to model. I’d encourage you to do it. I think we can create a lot of shareholder value this year.
Tim Clarkson: Right. And obviously, as sales go up, historically with Innodata, profitability has always gone up on balance, not every quarter, but typically it goes up much faster than the revenues.
Jack Abuhoff: That’s correct. And I think you see that operating leverage working very strongly in both Q3 and Q4. That operating leverage, and the disproportionate increases in profitability relative to revenue growth, will continue to work for us, I believe, and will give us the ability to further invest in the company, stay aligned with our market and stay ahead of our competitors. We think we’re managing the company appropriately from that perspective. We’re very happy, as we just said, to confirm that we don’t plan on needing to raise equity. We think that’s a very strong statement for a company that has been able to keep pace with competitors who are more significantly funded than we are, to compete aggressively with them and to win deals against them. So we think we’re managing the opportunity appropriately, and we think there are a lot of good things ahead for us.