Dou Shen: Great question, Wei Xiong. As you have already noticed, the traditional cloud business is slowing down, while generative AI and large language models are reshaping the competitive landscape of the cloud industry. In the past, the focus in the cloud market was on IaaS, which is like a commodity, and people were competing on pricing. But now, with the rise of generative AI and large language models, things are changing. There is growing interest among cloud customers coming to Baidu to utilize these sophisticated technologies to increase productivity and efficiency. They come to us not only because we have the most advanced AI technology, but also because we have experience and a track record in using AI to help enterprises solve problems.
Some of them are still in the product experimentation stage, but they have a firm belief in using the new technology to rebuild their products and services, because we have seen success stories overseas. That is why we are seeing the new technology increasing our TAM and expanding our competitive edge. So, as Robin mentioned, EB4 is China’s first GPT-4 style model. We also shared the positive initial feedback we have garnered for EB4. Currently, our teams are engaged in dialogues with our clients, assisting them in understanding the technology and utilizing ERNIE to redevelop their existing products and create new ones. So, I can say that ERNIE has already helped us attract new customers and additional IT spending from existing customers.
Here I would like to briefly bring up two points of advantage we have compared to other players in the market. The first one is our unique four-layer AI infrastructure, which gives us the flexibility to make adjustments or innovations at every layer while keeping the layers compatible with each other, so that we can keep driving efficiency in both model training and inference. The second one, more specifically, is our capability to build GPU clusters for large language model training. As Robin just said, 98% of the training time on our AI infrastructure is effective, and as a result, our customers, including several leading Internet and tech companies, are increasing their investments in our services. Furthermore, we will continue to leverage the unique advantage of our AI architecture to drive efficiency gains.
This will help us greatly reduce the costs of model training and inference on our cloud, giving us the flexibility to offer more compelling prices to our customers and further strengthen our competitive edge in the market. Regarding the competition from telecom operators, I would like to highlight that our focus is on different market segments, since we differentiate ourselves with our AI capabilities, particularly ERNIE, as we have mentioned. It is also worth noting that we can cooperate with them, and we are actually cooperating on many projects. As for other Internet companies in China, our strong AI capabilities and ERNIE are well recognized by the market, which sets us apart from our peers. To sum up, we believe that our strong AI capabilities, particularly in generative AI and large language models, will allow us to eventually become the market leader and gain share in the cloud market.
Operator: The next question comes from Lincoln Kong with Goldman Sachs. Please go ahead.
Lincoln Kong: Thank you, management, for taking my question. My question is also about ERNIE. Given the successful upgrade of ERNIE 4.0, what is the future strategy for model iteration to solidify our tech leadership? And do we foresee the competition in foundation models in the industry stabilizing or intensifying in the future? Thank you.
Robin Li: Hi, Lincoln. The AI chips we have in hand already allowed us to launch EB4. We are ahead of the competition. To take our lead in LLMs to the next level, we will take an application-driven approach. We will let the AI-native apps tell us what to improve in ERNIE Bot’s capabilities. Given that there are only a very limited number of AI-native apps on the market right now, the majority of ERNIE API calls come from our internal apps, like Search, Wenku, etcetera. The rebuilding and restructuring of our existing products drive ERNIE innovation in the right direction. What is equally important is that we are helping enterprises use ERNIE to build their offerings. We have seen that over 10,000 enterprises are using ERNIE through API calls on a monthly basis, which propels ERNIE’s improvement too.
We also continue to improve the efficiency of our models. For example, compared to the ERNIE Bot version in March, the inference cost of the current version has been reduced by 98%, basically resulting in a 50x increase in QPS for the same amount of computing power. We are able to do this using our unique four-layer architecture and leveraging our ability to do end-to-end optimization. Continued inference cost reduction has further strengthened our model’s competitive advantage, and it gives us the flexibility to offer more and more compelling pricing. From a long-term perspective, taking into account factors such as the scarcity of high-performance chips, the high demand for data and AI talent, and the huge upfront investments, the industry will soon transition into a consolidation stage.
We believe there will only be a select few foundation models in the market, and Baidu will certainly be one of them. In this stage of industry development, more and more enterprises will begin to leverage advanced foundation models like ERNIE to create AI products rather than spend resources on building their own large language models. So, we expect that the number of AI-native apps based on ERNIE will reach millions in the future.
Lincoln Kong: Thank you.
Operator: The next question comes from James Lee with Mizuho. Please go ahead.