Tesla, Inc. (NASDAQ:TSLA) Q4 2022 Earnings Call Transcript

Alex Potter: Okay, great. And then maybe 1 additional question here on the incremental capacity in Nevada, the 4680s that you’re planning. That’s a lot of batteries obviously, and presumably, you won’t be putting all of those in Tesla Semi. So I guess, two questions about that incremental capacity. First, is it correct to assume that all of those 4680s are going to be more or less fungible and usable in your entire range of products? And if the answer is yes, then if you had to guess, how do you think that 100 gigawatt-hours would be allocated between your various end markets?

Elon Musk: I don’t know, this is a bit too much guessing.

Andrew Baglino: Yes.

Elon Musk: But, yes. Yes, I mean, you’re right. Not all of the 100 gigawatt-hours are going to go into the Semi trucks, that is correct. Let’s say like — I alluded to a number of future products. Those future products would use the 4680.

Martin Viecha: Thank you. And the next question comes from George from Canaccord Research.

George Gianarikas: Hi, everyone. Thanks for taking my question. So you recently adjusted prices, and that may have put many of your competitors on the back foot. In addition to that, capital markets have recently gotten a lot tougher. So with those factors in mind, I’m curious how you see the current competitive landscape changing over the next few years. And who do you see as your chief competitors five years from now?

Elon Musk: Five years is a long time. I was with the Tesla Autopilot AI team until late last night, and we were just asking ourselves, like, who do we think is close to Tesla with a general solution for self-driving? And we still don’t really know who would even be a distant second. So, yes, it really seems like — I mean, right now, I don’t think you could see second place with a telescope, at least we can’t. But that won’t last forever. So, in five years, I don’t know, probably somebody has figured it out. I don’t think it’s any of the car companies that we’re aware of. But I’m just guessing that someone might figure it out eventually, so yes.

Zachary Kirkhorn: I mean, beyond that, Elon, like in the vehicle space, even though the market is shrinking, we’re growing, and EVs have almost doubled year-over-year. So, like, whoever keeps up with the trend of EVs is going to be our competitor. The Chinese are scary; we always say that. But a lot of people always look at EV market share, whereas the way we always look at it is how much of the total vehicle space do we have, and we’re just going to keep growing in that space. There’s 95% for us to go get.

Elon Musk: Yes. And I don’t want to say like — I think we have a lot of respect for the car companies in China. They are the most competitive in the world, that is our experience, and the Chinese market is the most competitive. They work the hardest and they work the smartest; that is true of the China car companies that we’re competing against. So we would guess there is probably some company out of China that is most likely to be second to Tesla. The Tesla China team is winning in China, and I think we actually are able to attract the best talent in China. So, hopefully, that continues. So, yes, we’re fired up about the future, and it’s going to be great.

George Gianarikas: Just as a follow-up, the Inflation Reduction Act has created huge tax incentives for commercial vehicles. You mentioned an incredibly interesting product pipeline. Are there maybe some plans to accelerate commercial vehicle form factors outside of the Tesla Semi to help accelerate EV adoption?

Elon Musk: Well, I was basically saying that, yes, but I’m not going to give you details because this is — nice try, nice try. Yes, of course, of course. So, we actually look at like, what is the limiting factor for new vehicles because if the — for the longest time, we’ve been constrained on total cell lithium-ion production output. And so people said, like, why not bring this other car to market or that other car to market? Well, it doesn’t really help if all you’re doing is shuffling around the batteries from one car to another. In fact, it hurts because you add complexity, but you don’t add incremental volume. So, it’s sort of pointless, in fact, like counterproductive to add model complexity without solving the availability of lithium-ion batteries.

So we want new product introductions to match when cells are available for that new product to use, without cannibalizing the cells of the other cars. That’s the actual limiting factor for new models, not anything else really.

Martin Viecha: Thank you. Let’s go to the next question. The next question comes from William Stein from Truist.

William Stein: Great. Thanks for taking my question. You started to answer this earlier, but I’d like to ask about the AI elements of your business: could you comment on progress around Dojo and Optimus, and on your anticipation of the likelihood, for example, of the company disconnecting the GPU cluster in favor of Dojo and of achieving some market success with Optimus?

Elon Musk: Yes. I mean, obviously, we’re still at the early stages, so there are big uncertainties in any predictions. It’s, I think, easy to predict the long term but hard to predict the time in between now and then. But we think Dojo will be competitive with the NVIDIA H100 at the end of this year and then hopefully surpass it next year. And the key metric there is, I think, the energy usage required for a given amount of training: if you’re training on a frame of video, what’s the energy cost required to do that training? We said this already at AI Day, so it’s not new information, but we do see potential for an order of magnitude improvement for Dojo relative to what GPUs can do, since Dojo is obviously very specialized for AI training.

It’s hyper-specialized for AI training. It wouldn’t be great for other things, but it should be extremely good for AI training. So just like if you do an ASIC or something, it’s going to be better than a CPU, this is, in some ways, like a giant ASIC. And since we’re operating one of the biggest GPU clusters in the world already, we’ve got a good sense of how efficiently GPU clusters operate and what Dojo needs to do in order to be competitive. But we think it does have a fundamental architectural advantage, because the GPU is trying to do many things for many people. It’s trying to do graphics, video games. It’s doing crypto mining. It’s doing a lot of things. Dojo is just doing one thing, and that is training.

And we’re also optimizing the low-level software too, at various sort of ground and middle levels, so it’s just insanely good at efficient training. And the intra-communication between the Dojo modules is extremely high; it’s not going across an Ethernet cable. So anyway, we see a path to an order of magnitude improvement in the energy efficiency per given unit of training. But we also have to achieve that. And so when will it be achieved? It’s hard to say, but we do see a path to get there. And then also on inference: once you’ve got something trained, if you want to have a product that’s a consequence of that training, that product may not have anything to do with cars. Then the efficiency of inference is extremely important.

And we also have, by far, the most efficient inference computer with the FSD computer in the car. This has potential for products that go beyond the car, even really beyond automotive.

Martin Viecha: Thank you. And William, do you have a follow-up?