10 Best Machine Learning Stocks According to Analysts

8. Symbotic Inc. (NASDAQ:SYM)

Number of Hedge Fund Investors in Q1 2024: 24

Analyst Average Share Price Target: $55.38

Upside: 40%

Symbotic Inc. (NASDAQ:SYM) is a specialty process automation firm that focuses on the needs of the warehousing and logistics industry. It brings machine learning to its customers through platforms such as its AI-powered robotic warehouse automation products. Symbotic Inc. (NASDAQ:SYM) is one of the few firms of its kind with a pure-play focus on warehouse automation, which gives it a significant competitive moat, evidenced by the firm’s $23 billion order backlog. Its financial heft is further bolstered by $546 million in cash and no long-term debt, giving Symbotic Inc. (NASDAQ:SYM) the flexibility to keep delivering on that backlog even as the broader cloud sector struggles. One of Symbotic Inc. (NASDAQ:SYM)’s biggest customers is the retail giant Walmart, which has teamed up with the firm to deploy AI automation across its warehouses.

One key application of machine learning is image recognition, and on this front, here’s what Symbotic Inc. (NASDAQ:SYM)’s management had to say during its Q2 2024 earnings call:

So we’ve been working on this for about three years now, putting vision on our bots, and that, combined with the NVIDIA chips that we’re using, allows us to recognize boxes that may be deformed, but still recognize what the product is. And so that makes our bots much more able to pick irregular cases. And as you know, we’re one of the only people, maybe the only one that puts our box directly on shelves. We don’t put them on trays. That requires a lot of expertise and a lot of knowledge. And so we’ve been on this journey for a while, and now about 40% of our bots in our network are vision-enabled. And so there’s a bunch of work for the AI to catch up with. Recognizing 1,000 different pictures of a single box and saying, oh, that’s XY’s product.

And the shape, I didn’t recognize it before because we were just using sensors, but now with vision, we can actually recognize that. So that’s one thing. The other thing we did is that we changed the routing algorithms for our bots, and they will also be vision-enabled so that they’re more reliable, so that if something happens, like a bot gets stuck on a broken case or something, we can now route around it. And we always could do that a little bit, but now we can do it much better. And then to be able to actually see the bot in front of us is also innovative. The other thing we’ve done is we have started work on perishable testing. And so we think that’s going to go fairly well because there’s not a lot of new things we have to do on perishables, but we want to test what happens when a bot runs over yogurt.

So things like that, and then the next thing after that will be testing bots in a frozen environment. So those are a couple of things we’ve been doing.
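For readers curious what the image recognition piece of this looks like in practice, the sketch below is a generic, hypothetical example of training a classifier to identify products from photos of their boxes, using heavy geometric augmentation to stand in for the deformed-carton problem management describes. It is not Symbotic Inc. (NASDAQ:SYM)’s actual system; the dataset path, folder layout, and model choice are all assumptions made purely for illustration.

```python
# Illustrative sketch only: a generic image classifier fine-tuned to recognize
# product boxes despite deformation. Not Symbotic's system; the "box_images/"
# dataset path and folder layout are hypothetical.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Aggressive geometric augmentation mimics crushed or tilted cartons, so the
# model learns to identify the product even when the box shape is irregular.
train_tf = transforms.Compose([
    transforms.RandomPerspective(distortion_scale=0.4, p=0.7),
    transforms.RandomResizedCrop(224, scale=(0.6, 1.0)),
    transforms.ColorJitter(brightness=0.3, contrast=0.3),
    transforms.ToTensor(),
])

# Hypothetical folder layout: box_images/train/<product_sku>/<photo>.jpg
train_ds = datasets.ImageFolder("box_images/train", transform=train_tf)
train_loader = DataLoader(train_ds, batch_size=32, shuffle=True)

# Start from a pretrained backbone and swap the head for one output per SKU.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(train_ds.classes))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for images, labels in train_loader:
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
```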
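The re-routing behavior management mentions can likewise be illustrated with a minimal, hypothetical sketch: a breadth-first search over a warehouse grid that plans a fresh path when a cell is blocked, say by a stuck bot or a dropped case. Symbotic Inc. (NASDAQ:SYM) has not disclosed its routing algorithms, so this is a generic example of the idea, not the company’s method.

```python
# Illustrative sketch only: breadth-first re-routing on a warehouse grid.
from collections import deque

def shortest_path(grid, start, goal):
    """Return a list of (row, col) cells from start to goal, avoiding
    blocked cells (marked 1), or None if no route exists."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([start])
    came_from = {start: None}
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:  # walk parents back to the start
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                queue.append((nr, nc))
    return None

# Two cells in the middle aisle are blocked (a stuck bot or broken case);
# the planner simply routes around them.
aisle_map = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 0],
]
print(shortest_path(aisle_map, start=(0, 0), goal=(2, 3)))
```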