It will also allow additional capabilities — machine learning enables analytics that learn in real time what's going on. Once you sense everything, rather than just pointing a scanner at a barcode, you see many more things happening in that image. And this is how you create many more efficiencies and even become proactive — anticipating and preventing problems from recurring for that same customer. So it's a long answer. But Neil, it's very simple: we are going to take scanning off the wall and replace it with machine vision. And that's why we always say that a $1 billion valuation is just the first milestone for us.
Unidentified Analyst: Okay. Well, Shai, that's great. And I think the long answer was exactly what I was looking for, so we can visualize where you are and where you want to go. Let me make my last question on this same topic. Do you have the capability today to do what you just described as the future of inventory and supply chain management? Can you do that today? And do you think we'll see some of these large, recognizable industry companies make the move to the next generation by the end of 2024? Or is this something down the road, three or four years out?
Shai Lustgarten: I'll start with the last part of the question. Our objective is to do something already this year, not to wait for 2024 or a couple of years down the road. That's one of our objectives for this year. And to answer the first part of your question: yes, we do have the solution, and today we complete it with additional third-party companies — for example, for the hardware, the same as we do today in parking or with Q Shield in cities. We're not manufacturing the sensor. In many situations we assemble the sensor, but we don't want to deal with the hardware; we want to focus on software.
Here it's the same thing. We have the software, we have the algorithms, and we know how to train the engine to identify objects. We have proved that we don't need a scanner to read — we already did that. We can read barcodes with smart glasses rather than using anything else. So we have the capability to train our engine and its algorithms to do that, and we are now expanding it to also identify additional parameters of a customer's product. For example, if it's a tire, we have already trained our engine to identify what a tire is, along with all the markings on it — the date, the batch number, etcetera. I'll just touch on that example in 10 seconds and that's it.
Today's problem with identifying tires is that they are identified by barcode stickers. When you ship tens of thousands of tires in a container, the sticker peels off, flips over, or is torn from the tire. And many, many times — it seems strange, but even the largest manufacturers in the world, and I don't want to mention the names of those we're talking to — don't know what's going into the stores. They know the quantity, but they don't know all the other details. We have already trained our engine to read all the markings, identify each tire, and do this on the fly. So to answer your question: yes, and that's one example of something we have already done.
Unidentified Analyst: Okay. All right. And Shai, I'll hang up after this. But is anybody beta testing this? Is anyone currently doing internal testing of what the solution would look like for them? Or is this all yet to come?
Shai Lustgarten: No, we started beta testing with one customer. It was paused as they came out of COVID and were trying to rearrange things, et cetera. But we did start the process, and we look forward not only to continuing with that customer but also to continuing with two others.