James Lee: Great. Thanks for taking my questions. Can you maybe quantify the investments related to AI and how they affect the various cost items in your P&L, and should we expect these investments to accelerate over the next few quarters? I was thinking especially about the launch of ERNIE 4.0 and potentially higher inference costs as more users start using it. And if we extrapolate that over a longer term, how should we think about Baidu Core OPM over the next few years, given all the moving parts, including the revenue shift, investment in AI and also your continued improvement in cloud profitability? Thanks.
Rong Luo: Hi James. Let me take your questions. This is Julius. Currently, the primary investment for generative AI and large language models is centered around computing power, which is recorded as part of CapEx. In the past few quarters, we have put a lot of chip resources into training our new ERNIE models, and in the future, as more AI-native applications powered by ERNIE become more widely used, we will also put more resources into model inference. However, please note that the impact of these AI-related investments is quite manageable, because the hardware depreciation is spread over a few years. For example, all expenses linked to the computing power we used in training ERNIE are recorded through R&D depreciation.
And the model inference costs are highly related to the usage of the models, either internally or externally, and should be supported by future business developments. Moreover, we are happy to see that our investments in generative AI and large language models are beginning to bear fruit. As Robin mentioned just earlier, since we received regulatory approval, the additional revenue generated from ERNIE-powered 2C and 2B businesses has been growing quite fast. While we are using generative AI and large language models to renovate our businesses, we are still keeping a close eye on making sure our Baidu Core earnings stay solid. In Q3, the mobile ecosystem continued to deliver high margins, ensuring very strong cash flow generation.
And the AI Cloud business continued its healthy growth and achieved profitability once again. Looking ahead, we expect the traditional cloud business to remain quite profitable, and the new opportunities arising from generative AI and large language models are also expected to have favorable margins in the long term. For the intelligent driving business, a long-term growth opportunity, we will continue to invest at a measured pace. All in all, we will concentrate our resources by reallocating them from non-core businesses to AI-related businesses. All of this should be quite beneficial for our long-term growth. Thank you, James.
Operator: The next question comes from Thomas Chong with Jefferies. Please go ahead.
Thomas Chong: Hi. Good evening. Thanks, management, for taking my question, and congratulations on a solid set of results. My question is on the chips side. Can management comment on the impact on AI development after the further restrictions on chip exports from the U.S.? How does that affect our AI product offerings and user experience, if any? Thank you.
Robin Li: Yes, the restrictions on chip exports to China actually have a limited impact on Baidu in the near term. We successfully launched EB4 in mid-October, our most advanced foundation model in China. It is a milestone for us. And as I said earlier, we have a substantial reserve of AI chips, which can help us keep improving ERNIE Bot for the next year or two. Also, inference requires less powerful chips, and we believe our chip reserves, as well as other alternatives, will be sufficient to support a lot of AI-native apps for end users. In the long run, having difficulties in acquiring the most advanced chips will inevitably impact the pace of AI development in China, so we are proactively seeking alternatives.
While these options are not as advanced as the best chips in the U.S., our unique four-layer AI architecture and strength in AI algorithms will continue to help us improve efficiency and mitigate some of these challenges. For example, we have made some innovations in PaddlePaddle, our deep learning framework, and the ERNIE Bot foundation model to make them better compatible with different kinds of AI chips for both model training and inference tasks. But given that all the other Chinese companies face the same challenge, we believe we are actually best positioned to serve the market. As you probably know, in the past, some of our peers tried to ride the Gen AI wave by investing in startups to train foundation models, and they basically sell the computing power to those start-ups.