ARBEW - Arbe Robotics Ltd. (ARBE) Q4 2022 Earnings Call Transcript

Arbe Robotics Ltd. (ARBE)

Q4 2022 Earnings Conference Call

March 2, 2023, 8:30 AM ET

Company Participants

Kobi Marenko - Co-Founder and CEO

Karine Pinto-Flomenboim - Chief Financial Officer

Ram Machness - Chief Business Officer

Noam Arkind - Co-Founder and CTO

Conference Call Participants

Gary Mobley - Wells Fargo

Josh Buchalter - Cowen

Matthew Galinko - Maxim

Jaime Perez - RF Lafferty

Presentation

Kobi Marenko

Thank you everyone for joining us today. Welcome to Arbe’s Fourth Quarter and Full Year 2022 Financial Results Webcast. My name is Kobi Marenko and I am the Co-Founder and CEO of Arbe. I am very excited to share the developments made by Arbe in 2022, especially in Q4, where we progressed from the proof-of-concept phase to the production and commercial deployment stage. After my presentation, Karine Pinto-Flomenboim, Arbe’s CFO, will share a view of our financials and outlook. Next, Ram Machness, our Chief Business Officer, will discuss the market forecast and the business opportunities that we are pursuing. Finally, Noam Arkind, our CTO and my Co-Founder, will conclude the presentations and speak about Arbe’s latest innovations and our vision for the future. We value your input and questions, so we will reserve time for a question-and-answer session. Please take a minute to review the Safe Harbor statement.

Throughout the year, Arbe was dedicated to maturing a cutting-edge perception technology that significantly improves vehicle safety and accelerates the realization of Level 2+ advanced driver assistance vehicles. Our company’s progress has paralleled the advancement of the vehicle market to Level 2+, which we anticipate will become the industry standard in the automotive market by 2025.

In 2022, we actively collaborated with leading automakers’ perception teams to ensure that our best innovations are designed into their groundbreaking vehicles. We received clear indications that our Perception Radar technology will be a crucial enabler and an integral component of Level 2+. As the mass market increasingly adopts autonomous systems at Level 2+, Arbe becomes a vital element of the overall solution with a cost-effective sensor.

We have been securing global Tier 1 partnerships and commitments to accelerate the transition to advanced vehicle safety systems in the near future. As we look ahead, we remain on track to achieve full production by the fourth quarter of 2023, thanks to the hard work and dedication of our team and our business partners.

In 2022, we collaborated with multiple global Tier 1 suppliers who will position Arbe for success in 2023 and beyond. Veoneer, a world leader in automotive safety, selected Arbe’s chipset for their next-generation radars. Veoneer is an attractive partner for Arbe as it currently produces more than 50 million radars per year and forecasts growing to roughly 250 million per year by the end of the decade.

In addition, Arbe received its first mass production preliminary order for 340,000 chipsets from HiRain Technologies, a leading Chinese ADAS Tier 1 supplier. In Q3, we announced that HiRain was selected to provide Perception Radars based on the Arbe chipset for an autonomous truck project in the Ports of China. This announcement validates Arbe’s leadership as trucks require the highest standards of safety and have the biggest need for advanced sensing. The HiRain business partnership will allow us to quickly scale revenue, deploying Arbe products in China, the largest and fastest growing EV market in the world.

Part of our success today is a direct result of the strong relationships we have built with industry leaders such as Valeo, Veoneer, HiRain and Weifu, suppliers to leading global automakers. This quarter marks a significant milestone for our Tier 1 business partners, as these companies have made considerable financial investments and have deployed dedicated and sizable teams to develop radar systems that utilize our cutting-edge radar chipset.

The Tier 1s are also developing software, utilizing the proprietary data generated by Arbe radars to enhance safety for OEMs. We are confident that these advancements will make a significant impact, not only for Arbe, but for the entire industry. We want to thank our Tier 1 partners for choosing and trusting Arbe and for working hard to secure customer wins. We look forward to updating you on specific wins.

In addition to our Tier 1 partners, we are very pleased with our collaborations with passenger and commercial vehicle makers and robotaxi companies. In fact, today we can report that we are actively engaged with 12 out of the top 15 automakers worldwide, conducting field trials with them and participating in RFI and RFQ bids.

Our goal for 2023 is to achieve two design-ins in the rapidly expanding Chinese market and two design-ins with OEMs from Europe and the United States. These design-in wins will lead to multiyear revenue contracts. In the following presentation, Ram Machness, our Chief Business Officer, will provide details about our forecast for 2023.

As some of you may know, the automotive industry in Japan is known for its high standards and regulations, subjecting all automotive technologies to a strict certification process before allowing commercial evaluation in the region.

I am pleased to announce that Arbe has obtained a Japanese Telecommunication and Radio Certification for our mass production RF chipset. With the certification process finalized, a leading Japanese automotive company has begun a development project based on our chips.

We are also conducting pilot programs with leading OEMs and Tier 1s in the Japanese market, one of the top automotive markets in the world. We believe this very important certification will open a new and large opportunity for our company.

As mentioned, our focus in 2022 and 2023 is on the transition from proof-of-concept to mass production. We have completed our dedicated chipset production line with Global Foundries in Vermont, USA. On this mass production line, our chips are now going through the AEC-Q100 qualification process, the required automotive-level qualification representing the highest safety standard, for example, testing the chips’ durability from minus 40 degrees Celsius to 125 degrees Celsius. This is an important step keeping us on track with our plan to begin mass production in the fourth quarter of 2023.

At the end of 2022, we launched our groundbreaking 360 degrees Radar-Based Perception Solution, which provides a comprehensive analysis of the vehicle’s surroundings over a long range. We take pride in the fact that we are the first company to offer an integrated 360 degrees Radar-Based Perception Solution. Our suite of Perception Radars captures surround data, utilizing advanced AI technology to identify, classify and track objects within the entire field of view. The data is processed in real time to create a full free space map around the vehicle and provide an analysis of the evolving hazards detected by the radars. We are excited to offer this innovative and cost-effective solution to the market and we believe it will set a new industry standard for vehicle perception systems.

Arbe experienced great success at the CES Trade Show in Las Vegas, one of the industry’s most important events. We had the pleasure of meeting with most of the automakers we are actively engaged with, who assured us that they have secured the necessary budgets to work on products based on our cutting-edge chipset throughout 2023. These projects are focused on driving significant advancements in vehicle safety and we are proud to partner with our customers to bring these innovations to the market.

In addition, CES was a fantastic opportunity to meet with potential new customers. We were pleased to learn that Arbe’s leading innovations and value proposition resonate with the needs of these customers.

We are also proud to announce that Arbe has once again been awarded the prestigious CES Innovation Award, this time for the 360 degrees Radar-Based Perception Solution. We are thrilled to be recognized for our pioneering work in this field that will advance automotive safety in the near future.

Looking forward to 2023, our goal is clear. We expect to enter full production with our cutting-edge chips in the fourth quarter. In fact, we have already received orders that meet our 2023 production capacity. In addition to our production goals, we continue to focus on ambitious sales and business development objectives of OEM design-ins for the year ahead.

Our disruptive solutions have quickly gained recognition and endorsements from industry leaders and we believe that our strong offering will enable us to gain market share and drive innovation forward.

At this point, I would like to hand over to our CFO, Karine, who will provide a detailed overview of our financial performance.

Karine Pinto-Flomenboim

Thank you, Kobi, and hello, everyone. Let me review our financial results for the fourth quarter and the full year of 2022 in more detail. As Kobi said earlier, Arbe is a company in transition from development to production and our financial results in Q4 reflect this.

As we progress with our strategy, we are shifting our focus to production-intent chips. As a result, we decreased engineering sample sales during Q4. This transition will streamline our operations and provide cost savings as we adjust our processes and ramp up production accordingly. We believe that these decisions will enable us to better serve our customers and drive innovation forward.

Total revenue for the fourth quarter was $0.15 million, compared to $0.5 million in the fourth quarter of 2021. For the full year of 2022, total revenue was $3.5 million, within our guidance and an increase of 56% compared to $2.2 million in 2021.

Gross margin in Q4 was negative 45.8%, compared to a positive gross margin of 37.7% in Q4 2021. This negative margin is another consequence of our reduced quarterly revenue as we transition to mass production. Gross margin for the full year of 2022 increased to 63.5%, compared to 36% in 2021. The 2022 gross margin improvement was driven mainly by economies of scale, revenue mix and lower cost per unit as we progress towards production.
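As a rough illustration of how a negative quarterly margin like this arises: gross margin is simply revenue minus cost of revenue, as a percentage of revenue. The cost figure below is implied by the stated margin, not separately reported on the call, so it is an assumption for illustration only.

```python
def gross_margin_pct(revenue, cost_of_revenue):
    """Gross margin as a percentage of revenue ($ millions);
    negative when costs exceed revenue."""
    return (revenue - cost_of_revenue) / revenue * 100

# Q4 2022: $0.15M revenue against roughly $0.219M cost of revenue
# (implied, not reported) reproduces the stated negative margin.
q4_margin = gross_margin_pct(0.15, 0.2187)
print(round(q4_margin, 1))  # -> -45.8
```

With revenue this small, even a modest fixed cost base swings the percentage sharply negative, which is why the full-year figure (63.5% on $3.5 million of revenue) looks so different from the quarter.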

Moving on to expenses. In Q4 2022, we reported total operating expenses of $14 million, compared to $14.2 million in Q4 2021. A decrease in our pre-production-related costs and the favorable impact of foreign currency exchange rates were offset by an increase in labor costs and noncash share-based compensation expenses.

Operating expenses for the full year totaled $50 million, compared to $34.1 million in 2021. The increase was in line with our expectations, as we continue to grow the company and add to our employee base to support our future growth.

The company continues strengthening its research and development investment, with R&D expenses totaling $10.8 million in Q4 2022, compared to $11.6 million in Q4 2021; the decrease related to lower pre-production costs. R&D expenses for the full year were $36.7 million, compared to $28.6 million in 2021.

Operating loss for the full year of 2022 was $47.7 million, compared to a loss of $33.3 million in 2021. The operating loss reflected our growing investment, mainly in research and development and in our human assets, and the costs associated with being a publicly traded corporation, all in support of our progress toward production.

Adjusted EBITDA in Q4 2022, a non-GAAP measure which excludes expenses for non-cash share-based compensation and for non-recurring items, was a loss of $11.5 million, also within our guidance, compared to a loss of $11.9 million in the fourth quarter of 2021. Adjusted EBITDA for the full year 2022 amounted to a loss of $38 million, compared to a loss of $30.4 million in 2021.
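A sketch of how a non-GAAP figure of this kind reconciles to the GAAP operating loss: non-cash and non-recurring items are added back. The individual add-back amounts below are illustrative assumptions (the call discloses only the totals); what the full-year numbers do imply is roughly $9.7 million of combined add-backs.

```python
def adjusted_ebitda(operating_loss, add_backs):
    """Non-GAAP adjusted EBITDA: operating result plus non-cash and
    non-recurring add-backs (e.g. depreciation, share-based
    compensation, one-off items). $ millions; losses are negative."""
    return operating_loss + sum(add_backs)

# Full-year 2022: the $47.7M operating loss plus roughly $9.7M of
# combined add-backs (illustrative split below, not disclosed)
# reconciles to the reported $38M adjusted EBITDA loss.
full_year = adjusted_ebitda(-47.7, [1.2, 7.5, 1.0])
print(round(full_year, 1))  # -> -38.0
```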

Net loss in the fourth quarter of 2022 decreased to $11.1 million, including $3 million of financial income, compared to a net loss of $15.8 million in the fourth quarter of 2021. Net loss for the full year of 2022 was $40.5 million, compared to $58.1 million in 2021.

Moving to our balance sheet. As of December 31, 2022, Arbe had $54.2 million in cash and cash equivalents with no debt.

With respect to our guidance for 2023, we would like to provide an outlook for the full year ending December 31, 2023. Our goal for 2023 is to achieve four design-ins with OEMs. Revenues are expected to be in the range of $5 million to $7 million, reflecting our expectation of full production in Q4 2023 together with our decision to focus exclusively on production-intent chips.

Adjusted EBITDA is expected to be in the range of a $32 million loss to a $35 million loss, an improvement primarily related to year-over-year revenue growth, as well as a decrease in initial production costs and cost-efficiency efforts.

I am now pleased to hand the floor over to our Chief Business Officer, Ram Machness, who will share insights into our market forecast and the business opportunity that lies ahead.

Ram Machness

Thank you, Karine. I am Ram Machness, and I am the Chief Business Officer at Arbe. We would like to provide you some outlook for 2023 and summarize 2022. We will talk a bit about the market trends that we saw in 2022, about our business model, about what’s next, where we see the main trends and where we see our product winning in the market.

So when we look at the market and the trends for 2022, we see a lot of changes in the autonomous driving arena. We saw a lot of buzz around autonomous driving, but the idea of a whole solution providing full autonomous driving is a trend that is diminishing, and we are instead seeing individual features and scenarios being developed.

So instead of providing a full solution that works in any scenario, on any road, in any weather, fully unsupervised, we are seeing a trend of developing smaller features, capabilities built for specific scenarios, that work in specific conditions, are sometimes supervised, and sometimes just alert the driver to something happening around the vehicle.

So if, for example, in the past we wanted the car to drive everywhere, all the time, now we are talking about scenarios with very clear entry and exit criteria: for example, driving on a highway or in a traffic jam, in specific weather conditions, in daylight, at night or in fog, possibly partially supervised by the driver, or, for example, just alerting the driver about something dangerous that is about to happen.

So the focus of the industry shifted from solving the entire problem to solving one piece of the puzzle at a time, and providing the means for the algorithms and the developers to be able to solve those problems.

So we are seeing many of the OEMs, the car manufacturers, looking at providing the bits and pieces to build these solutions. It is a very gradual approach. We also saw a strong software-defined vehicle trend, meaning software that evolves during the ownership of the car.

So when you buy the car, it doesn’t necessarily do all the features, but gradually you can upgrade it. And the car manufacturers want to make sure that the hardware they are putting in the car is capable of supporting the advanced features later on.

So it’s actually about making sure that the hardware itself is an enabler for future technology and not an obstacle that prevents the car manufacturer from providing the full feature set later on. Some of the features are not ready yet, but the car manufacturers want to make sure that the car is ready to include those features in the future.

So we are seeing OEMs selecting hardware platforms, whether the compute platform or the sensing platform itself, that are future-ready for the software and algorithms coming down the road.

What we also see across all the OEMs is the understanding that Imaging Radar, the capability to have an independent source of information for the autonomous features and the autonomous driving capabilities, is a fundamental element that the car manufacturers are looking to add to their vehicles.

We see it not only coming from us. We also see other players in the market, like Mobileye, saying that Imaging Radar is a crucial element in being able to provide those features. And we saw even some players that in the past said the radar’s role is not that significant now placing a much bigger emphasis on Imaging Radar and saying that Imaging Radar is necessary in order to provide those advanced features coming down the road.

If we look at the market from the perspective of Arbe, we are seeing headway from the different OEMs in 2023. They are selecting this year many of their hardware platforms and sensing platforms, making sure these are ready for those features. They are also looking to make sure that, from a regulatory perspective, they are ready to meet all the NCAP and other regulatory requirements. We are today in contact with 12 out of the top 15 global OEMs, who are looking at integrating Arbe radars into their next platforms. We also announced our penetration of the Japanese market: we passed the Japanese regulation and certification for a radar based on our chipset, so we can start working in that market as well.

When we look at the radars and what Arbe is bringing to the table, we are talking about Perception Radar, which is very, very different from the regular radars we have been used to until now.

So if you look at the low-end radars that are out there today, the traditional radars, and take the example of a bus with a pedestrian walking just next to it, with a traditional radar you won’t be able to see two objects. You will just see that something is going on there. You might be able to see the velocity of these objects, but you won’t be able to really separate them, and you will be able to detect them only when they are moving. If they are stationary, you won’t be able to detect them at all.

With the basic Imaging Radars that are coming now to the OEMs and the car manufacturers, you might be able to see that there is a bigger object, and maybe sometimes identify it as stationary, but not really separate the two and see two distinct objects even when they are stationary. That is what the Perception Radar is really bringing to the table.

So it’s really bringing an image of the outside world into the perception algorithms, which need an independent source of information. They have the camera today, but the camera by itself is not enough, because it’s not bulletproof. It’s not ready for every kind of weather. It’s not ready at night. And sometimes it doesn’t deliver the right analysis of the scene around it. So you need another independent source of information, and only Perception Radars, high-end Imaging Radars, are able to provide the details that the perception algorithms need.

So if we look at the current-generation radars, many of them are good for adaptive cruise control, maybe for some emergency braking, because they can see if there is something in front of you. But they won’t be able to understand exactly the size of it or whether it is stationary, or identify it at all if it is stationary. The Perception Radar, in contrast, can clearly show the boundaries of these objects and the free space: whether I can squeeze in between two vehicles, whether a pedestrian is walking next to a truck, so that I see the truck and, separately, the pedestrian.

If someone has a flat tire on the Autobahn and is changing the tire, we will be able to see it with a Perception Radar, and we won’t be able to see it with a basic Imaging Radar or with the low-end, traditional radars that are out there today.

So this is a fundamental revolution in the sensing capabilities that the perception teams are getting with the Imaging Radar. And as we and other players are saying, you need a few thousand channels to really process this information and get a detailed image in order to truly leverage it.

When we look at the automotive market, it is organized around OEMs, the regular players we are all familiar with that own the big car manufacturer brands, and Tier 1s, the companies that provide the hardware to these OEMs.

Arbe is a chip manufacturer, so our main customers in the automotive market are the Tier 1s, and we have four Tier 1s that publicly announced they are basing their next-generation radars on our chipset: Veoneer, a global leader in radar; Valeo, a leader in ADAS; and Weifu and HiRain, Chinese leaders in their respective markets. We also have a Tier 1 addressing the non-retail automotive market, taking the technology to commercial vehicles like trucks, delivery robots, and agriculture and security use cases beyond automotive.

So, overall, from our perspective the main customers are these big Tier 1s, and we were able to announce these four Tier 1s that are already basing their next generation on the Arbe chipset.

Overall, we are seeing these Tier 1s putting tens of engineers, sometimes even 100 engineers, to work developing their next-generation radar on our chipset, and we see a lot of momentum and a lot of mutual work between us and these Tier 1s. All five of these Tier 1s already have an A sample radar based on our chipset. Two Tier 1s already have a B sample, two other Tier 1s will have a B sample very soon, and over the rest of the year all of them will have their B samples.

We also publicly announced AutoX and Biker as OEMs using our chipset. Unfortunately, I cannot talk right now about the selections of other OEMs, but we do have active projects with four OEMs, and overall there are right now 10 active RFIs and RFQs with different global OEMs through the Tier 1s that are using our chipsets.

Going into 2023, we are expecting to see two more significant OEMs signing a deal for production with our chipset, and two new Chinese OEMs signing a deal for production with our chipset.

In the non-retail automotive market, we are expecting five more deals coming down the path. We will be in automotive-grade full production in Q4 of this year, 2023. Until then, we are expecting the revenue to come mainly from evaluation systems, from samples and from projects that we are doing together with the OEMs and the different Tier 1s. Overall, we are expecting revenue between $5 million and $7 million in 2023.

With that, I would like to introduce our Chief Technology Officer, Noam Arkind, who will provide insights into the groundbreaking innovations that Arbe has developed and what lies ahead for our company.

Noam Arkind

Thank you, Ram. Hello, everybody. I am Noam, the Co-Founder and the CTO at Arbe. Today, I am going to show you our technology. Radar in general is the perfect sensor for automotive. We all know that it has great features. It’s active day and night. It’s not affected by weather conditions. It’s very sensitive and can see targets very far away. It has a high refresh rate. It directly measures four dimensions: azimuth, elevation, range and doppler. And it’s extremely reliable and affordable. This is why this technology was introduced into the automotive industry a long time ago and is already very mature, with over 20 years in the industry.

Now what we do at Arbe: we took this technology and pushed it to the edge in terms of performance. The technology that Arbe provides is the best radar that is possible within the limits of the automotive industry. We achieved this using a massive MIMO concept, which means that we have a lot of channels inside the radar that we can process.

Our main radar, the one we aim at the front of the car, and in some applications at the back, uses 48 transmit and 48 receive channels. We also have a scaled-down version of that radar, which we aim at the sides of the car, with 24 transmitters and 12 receivers.

Now in radar, the number of channels means performance; they are really equivalent. You can think about it like the equivalent of camera pixels. Imagine that you get a camera with 2,000 pixels and you compare it to the other cameras, the radars that exist on the market today, with 10 pixels or 100 pixels. So we get 10 times better performance than the state-of-the-art that’s available today in the market, because of the breakthrough we were able to achieve.
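The channel arithmetic behind the pixel analogy follows the standard MIMO virtual-array rule (transmitters times receivers). The call itself quotes only the physical channel counts, so treating the product as the effective channel count is an assumption based on that standard rule:

```python
def virtual_channels(n_tx: int, n_rx: int) -> int:
    """Virtual-array size of a MIMO radar: every transmit/receive
    pair contributes one virtual channel."""
    return n_tx * n_rx

front = virtual_channels(48, 48)   # front/rear radar configuration
scaled = virtual_channels(24, 12)  # scaled-down configuration
print(front, scaled)  # -> 2304 288
```

Note that the 48x48 configuration lands in the "few thousand channels" range mentioned earlier, one to two orders of magnitude above a radar equivalent to 10 or 100 pixels.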

Achieving this was a very hard goal and it was very hard to get to. The way that we got there was by innovating in every aspect of the radar system design, and we had to design the core components of the radar system by ourselves. This is why we designed our own radar chipset, which includes a receiver chip, a transmitter chip and a digital chip for the processing.

In this digital chip, we have what we call a radar processing unit, a dedicated IP embedded in the silicon that puts all of our best ideas and innovation into a cost-efficient and high-performing processing core.

We also had to innovate on the way the radar works from the physical point of view, so we had to invent a new type of modulation. All of this is part of our core IP, which is the basis of how we achieve this high performance.

This sensor is critical for a wide range of driving scenarios, starting from the most basic application, obstacle avoidance. The radar is very good at mapping the environment and understanding all the cars around your own car and the obstacles, the stationary objects that we have on the road, maybe something that fell out of the truck in front of you and things like that. The radar is very sensitive, very good at detecting these objects, mapping them and allowing the car to avoid them.

But this is not the case for all types of radars. If you have a low-resolution radar, it’s very hard to really understand the environment and create this map. So this is an innovation that is enabled only because we have this technology with its high resolution and high channel count.

Similarly, when entering an intersection, you have to see at long range and high resolution in order to understand if there is a car coming in your lane or in the lane you are trying to get into. It’s a very hard task for radar, and this is something that we are enabling with our technology.

But we are also looking at edge cases. One of the most problematic edge cases for radar technology is the truck stuck under a bridge, because the bridge and the truck get mixed together. For a normal radar, it’s very hard to understand whether the road is clear and you can drive there, or whether the road is blocked. But because of our technology, this use case is also solved, and we can provide a map with higher accuracy in the elevation dimension as well.

One of the other very important use cases, which as we know is very hard, and we see cars failing at it all the time, is the VRU detector. VRU stands for vulnerable road users. Specifically, we are talking about pedestrians, motorcycle and motorbike riders, which are harder to detect. Their radar signature is low, and they move at very slow speeds, so we can’t really use motion as a detection principle. We have to get this high resolution, and this is something, again, that our technology solves, allowing the car to have another sensor, like the camera, that can see those road users.

So this is a demo of how the technology really works. You can see that we are able to map the entire environment with 360-degree coverage around the car, and in real time we create this high-resolution map to allow the car to navigate the road. This application is called Simultaneous Localization and Mapping, and it is considered one of the most difficult applications to do on a radar-only stack. We have now proved that this is possible with our technology. This could be the basis for planning the trajectory of the car for autonomous driving, but also for simpler ADAS functions like emergency braking and lane changing. Once you have this high-resolution map with object perception, and of course velocity perception, doppler perception, you can really make conscious and smart decisions while you drive.

We created a very strong patent portfolio that protects us and really covers the core technology. And we are using the time that the delays in the market have given us to further extend it, to improve our technology, to improve our competitive advantage and to look to the future.

So we have core patents: nine granted patents around our core technology, covering the modulation and the processing. How exactly we do the processing of the radar is a very new approach; it is not straightforward, not something that you take out of the radar book. We had to innovate on that, on how you design the system, how you design the antenna array and how you design the package for the RF chips, because in order to get the cost down, we had to integrate many channels together in a small silicon area and a small package area. The way to do it is not trivial, it’s quite hard, and we also have some core IP in that regard.

Another problem which is very fundamental to the radar industry is mutual interference. Radars always operate in the same frequency band, so they can interfere with each other. We have invested a lot of effort and a lot of time in investigating how these interferences behave, what they look like and how they affect the system. We have generated very strong IP in that field as well; interference is going to be one of the limiting factors of the technology, and we think we have a very good solution for coping with this interference problem.

Now, looking forward at the milestones we have ahead of us in terms of technology and innovation is very exciting for me. Already today, we have the richest point cloud, and here I am talking about the physical representation of the world. This means that we are using the basic physical principles of the radar to generate the image.

But we know that this is not all we can do. There are more secrets inside this information, and we can really create more insights into the scene and the map around us from it. The data that we have today is rich, and it’s a great basis for the next stage, which is creating a super-resolution point cloud from the data that we have today. This is something that we are working on. We already have a prototype of it, and it will go into our production solution this year, probably in the middle of the year.

Next, what we are really trying to get is a very accurate representation of the world. If you compare it to other sensors, we know that autonomous driving stacks today rely a lot on LIDAR, which is a very expensive sensor and has its own problems.

With these techniques, with these projects we are now introducing, we are trying to get something which is similar to a LIDAR. But of course, the radar has its own advantages that I mentioned before, and it's also almost 10 times cheaper than LIDAR solutions.

So we believe that if we can get to a resolution and level of detail which is close to what we can get from LIDAR, then we can offer this high-resolution radar as an alternative and reduce the cost of the autonomous driving stack for future systems. This is the key to getting this autonomous driving technology into mass adoption in the market, so we see this as a very critical stage to make happen.

We are also innovating on the perception stack that we have in the radar. We know that AI revolutionized how people are doing digital image processing, and we believe that the same thing will happen in radar. Today, most radar processing stacks are based on a model-based approach, but there is much more you can get from the data if you employ these AI techniques on the radar data.

But we also understand that these techniques, which were developed for different applications, will have to be adapted. We are working on it, and we have many internal projects trying to understand what is the best way to run AI techniques and AI infrastructure on radar data. We see ourselves as -- we should be the leaders in AI for radar.

So with that approach, we want to get the best out of the radar: the best radar perception, the best radar-only stack, I would say. But we say, okay, we are taking the radar to the edge; now we want to see what it does for the overall system.

So we are also working on a fusion application. The full sensor stack of the car will have a radar and cameras, that's for sure, and we want to see how much the radar really adds on top of the camera.

We know that in poor lighting conditions, for example, it's going to be critical. So we are now working on this application for fusing our radar data into a camera stack, and showing that if you use the radar plus the camera, you solve the perception problem and you can make highly reliable decisions in all scenarios.
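A common first step when fusing radar detections into a camera stack is geometric: project each 3-D radar detection into the camera's pixel coordinates so it can be associated with image regions. A minimal pinhole-camera sketch -- the intrinsics, coordinate convention, and function name here are illustrative assumptions, not Arbe's pipeline:

```python
import numpy as np

def project_radar_to_image(points_xyz: np.ndarray, fx: float, fy: float,
                           cx: float, cy: float) -> np.ndarray:
    """Project 3-D radar detections (x right, y down, z forward, metres)
    onto camera pixel coordinates using a simple pinhole model."""
    x, y, z = points_xyz[:, 0], points_xyz[:, 1], points_xyz[:, 2]
    u = fx * x / z + cx
    v = fy * y / z + cy
    return np.stack([u, v], axis=1)

# Hypothetical intrinsics and three radar detections ahead of the car.
pixels = project_radar_to_image(
    np.array([[0.0, 0.0, 10.0], [2.0, 0.0, 20.0], [-1.0, 0.5, 5.0]]),
    fx=800.0, fy=800.0, cx=640.0, cy=360.0)
# pixels -> [[640, 360], [720, 360], [480, 440]]
```

Once radar detections land in pixel space, the camera stack can use them to confirm or reject its own detections, which is exactly where radar helps in poor lighting.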

Finally, to really make all of this a product, we have to embed it into our next-generation chipset, and this is always a goal for us and what we will do in the next generation. We will further improve the native physical resolution, and we are going to take the AI-on-the-edge approach in terms of radar processing.

So we have to add all the necessary IP to run AI algorithms on the radar point cloud, on the radar data, already on our processor, so people can run their algorithms there and not put everything in a central computer, which would increase the cost of the central computer and be a non-scalable approach. By putting these AI cores on the edge, we create the system flexibility and the scalability to improve performance in the future.

We also want to enhance our software-defined radar capabilities. Today, we can update software and change the way the radar operates on the fly, and this is an ability that we see as very important in the market to make the sensor future-proof. Imagine that a car manufacturer took a radar which is 10 times better than any other radar on the market and put it on the car, but it still needs experience in order to use and leverage this information. It can make the sensor future-proof by updating software and exploiting the radar information better. With this approach, we are also aligning ourselves to the market and better supporting our Tier 2 model by improving our software-defined radar capabilities.

The most important thing, of course, is reducing costs, because we know the automotive industry is very cost-sensitive. In the next generation, we are looking for ways to further reduce the cost, have better integration, put more channels on the same silicon, and reduce the cost and power even further.

One of the ways we are going to do it is by introducing a high-channel-count transceiver. Today we have separate transmitter and receiver chips, which was a good technical decision at the time. In the next generation, we are already looking at how to put them together and provide a high-channel-count transceiver, and this will also help improve performance and reduce the cost of the system. So this is how we see the road ahead of us, and we are very excited about the future.

So thank you, everybody, for listening, and now, Kobi, back to you.

Question-and-Answer Session

A - Kobi Marenko

Now we will be happy to take your questions. The first question comes from analyst Gary Mobley of Wells Fargo. Hi, Gary?

Gary Mobley

Hey, guys. Thank you for taking my question. I want to pick up where you just left off in your prepared remarks and talk about the future architecture and the competitive environment. Do you see a situation where automotive OEMs, like Tesla, for example, may have the capability to develop high-definition radar solutions, in your mind? And do you see a path toward maybe some of what your competition is doing, that is, centralizing -- using camera image sensing, data sensed in the centralized processor or [inaudible] control -- or do you still feel secure in terms of edge processing in the radar specifically?

Kobi Marenko

So thank you for these questions. We believe that the amount of data that is generated from a 4D high-resolution Imaging Radar cannot be processed anywhere but on the edge. Just to take this data, the theoretical data of 1 tera, and move it to the central compute is unrealistic.

The same with OEMs that would develop their own chipset: this doesn't make sense, and there's no economic scale for that. We have our own processor, and we believe that our core IP, our processor and our next generation of the processor will give the OEMs much better abilities to develop their own stack, their own kind of radar, on top of it, and of course the same for the Tier 1s. But we don't see the OEMs -- we haven't heard even from one OEM that they want to develop their own chipset. They might want to go directly to the Tier 2 to make sure that the supply will be there, but not for development.

Gary Mobley

Thank you.

Karine Pinto-Flomenboim

Now we will take questions from Josh from Cowen.

Kobi Marenko

Hi, Josh.

Josh Buchalter

Yeah. Sorry, and thank you for taking my question. I appreciate all the points you made there, but I just want to ask about the financials. Relative to the original expectations, 2023 is coming in materially softer than your original forecast. I think 2023 was originally driven by ramps of Chinese customers and also robotaxi. Can you speak to what's driving the softer growth outlook versus original expectations and then, in fact, more importantly, [inaudible]?

Kobi Marenko

So basically, as we see it, we are in a shift of around three quarters from our original plan of two years ago. This shift was caused mainly by the supply chain problems, which slowed down our production timeline on one end, and on the other end also slowed down the decisions on the OEM side.

Also, robotaxis and full autonomous driving are slower than expected, and the forecasts that we got from our customers two quarters ago, or even one quarter ago, are basically slowing down. But I think the good news is that the supply chain problem is behind us and behind our customers, and China is trying to close the gap rapidly. We believe that in 2024 we are going to be more or less where we wanted to be and to narrow this gap, and of course in 2025 and 2026 as well.

We already have a preliminary order for the end of 2023 and early 2024. This is just the first order, worth around $30 million, and we believe that in the coming two quarters we will have a few more orders that will build our 2024 forecast and the 2024 expected revenues.

Karine Pinto-Flomenboim

To add to Kobi, Josh, it also goes a little bit together with the language that we wrote. We wanted to focus these coming quarters on serial production and to aim our revenue at that pace, which will actually give us more potential recurring revenue in the future, increase our customer base more solidly, and have customers that, as we see it now, endorse our technology.

Josh Buchalter

I will take that offline [inaudible].

Kobi Marenko

Hi, Matthew.

Matthew Galinko

Hello. Can you just touch on the OEM repositioning [inaudible]? I think in your presentation slide, the OEMs are now evolving their sensor stack, again becoming more supportive and more comfortable over time. So there [inaudible] is the idea that the radar is the primary sensor, or [inaudible] radar and camera we expected to [inaudible]?

Kobi Marenko

Yeah. So I think this is a great question. What we see from the OEMs is a very strong desire to make their hardware ready for the features coming with autonomous driving Level 2+ going towards Level 3, and to make sure that their hardware is capable of processing and sensing the environment. For that, you need the Imaging Radar as an independent sensor compared to the camera.

So you have the camera -- you will always have the camera -- and in order to have safe features, for example, driving on the highway or driving in a traffic jam, you need another independent source of image. For that, you must have a high-quality, high-definition Imaging Radar, like the one that we are providing the chipset for.

Matthew Galinko

Okay. If I could just ask a quick follow-up. You mentioned you have been engaged with 12 of the top 15 automakers. Are the others engaged with advanced radar competitors, or are they radar skeptics at this point? Why isn't it 15 of the top 15?

Kobi Marenko

Yes. So the current radars are doing 4 by 3 or 6 by 8, and you are right that some of the basic Imaging Radars are going towards 12 by 16, but this doesn't give enough information for the perception stack to really understand the image around us. Only when you go to 48 by 48, or a much higher channel count -- some of our competitors also announced 2,300 channels and a few thousand channels -- only then do you really get the information that the perception team needs in order to drive safely with those features.

Ram Machness

But to answer the question, we believe that the other three are working with the low-end Imaging Radar, as we call it.

Matthew Galinko

Thank you.

Kobi Marenko

Jaime?

Jaime Perez

Hi. Good day, everybody. Thanks for taking my question. I have more of a technical question. In your presentation, you mentioned that you have multiple frequencies. Is that multiple frequencies to guard against other radar systems, or frequencies that come from other components? Just curious how that is progressing.

Kobi Marenko

The multiple frequencies, yes, are for making sure that we won't be interfered with by other radars, and also by our own radars, because sometimes there is more than one radar per car. So we are changing the frequency -- hopping on the frequency -- in order to make sure that all of the radars in the environment can live together without interfering with each other.
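The frequency-hopping idea described above can be sketched as follows. This is a minimal illustration under stated assumptions, not Arbe's actual scheme: each radar derives a pseudo-random channel sequence from its own identifier, so two radars rarely transmit on the same channel in the same frame:

```python
import random

def hop_sequence(radar_id: int, n_frames: int, n_channels: int) -> list[int]:
    """Pseudo-random frequency-channel sequence, seeded per radar so
    that two radars seldom land on the same channel in the same frame."""
    rng = random.Random(radar_id)  # per-radar seed -> decorrelated hops
    return [rng.randrange(n_channels) for _ in range(n_frames)]

a = hop_sequence(radar_id=1, n_frames=1000, n_channels=16)
b = hop_sequence(radar_id=2, n_frames=1000, n_channels=16)
# With 16 channels, two independent radars collide in roughly 1/16 of frames.
collisions = sum(x == y for x, y in zip(a, b))
```

Even when a collision does occur, it lasts only one frame before both radars hop away, which is why hopping combines well with the burst-blanking style of mitigation.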

Jaime Perez

Okay. And my second follow-up is more of a financial question. I know you mentioned you are going to go into production in the fourth quarter. What are the components of revenues this year? Is it going to be evaluation sales or testing sales?

Karine Pinto-Flomenboim

So we are focusing, of course, as I said, on chipset production, but it will be back-end loaded towards the end of the year. During the year, we will still have small amounts of chipsets for non-automotive customers and also additional small volumes for our known customers.

Kobi Marenko

But the major part is the production of our chips and the first…

Karine Pinto-Flomenboim

Which is -- yeah.

Kobi Marenko

… production chips that we have…

Karine Pinto-Flomenboim

Plan…

Kobi Marenko

… preliminary order for that.

Jaime Perez

Oh! Okay. That’s all the questions I have for now. Thanks.

Karine Pinto-Flomenboim

Thank you, Jaime.

Kobi Marenko

Thank you. Okay. We will have some more questions from the audience.

Karine Pinto-Flomenboim

From the audience. Yeah. How is Arbe different from, or complementary to, Mobileye?

Ram Machness

So both Arbe and Mobileye announced that the direction is high-channel-count radars based on a technology called FMCW, so in that sense, we are very similar, trying to solve the problem of having good sensing for perception.

The Arbe chipset is going to be in production this year, in the fourth quarter, and that's ahead of the competition. We are already working today with the four Tier 1s that we announced, with others in the pipeline, as well as OEMs that are working right now to integrate this technology into their vehicles.

Kobi Marenko

Next question.

Karine Pinto-Flomenboim

Next question from Billy [ph]. How close is the next best competitor to your product performance levels?

Ram Machness

So if we put aside the very high channel count like what Mobileye is doing and Arbe is doing, the next ones are the 12 by 16, which are basic Imaging Radars. They are far behind in terms of channel count: it's 192 channels versus 2,300, roughly a 10x factor comparing these two kinds of solutions, and the results are accordingly.

So the level of detail, the level of false alarms that will cause phantom braking, and the level of misdetection that you get with the low channel count are dramatically worse, and the performance that you are getting with the Perception Radar is dramatically higher and better, and is required to get an image of the surroundings, of the stationary objects, and to define and find the obstacles around you.
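The channel counts quoted above follow from the MIMO virtual-array rule: the virtual channel count is the number of transmit channels times the number of receive channels. A quick check, taking the 12-by-16 and 48-by-48 configurations from the remarks above (the exact ratio works out to 12x, which the rounded "10x" refers to):

```python
def virtual_channels(n_tx: int, n_rx: int) -> int:
    """In a MIMO radar, the virtual array size is the product of the
    transmit and receive channel counts."""
    return n_tx * n_rx

basic = virtual_channels(12, 16)   # basic imaging radar: 192 channels
high = virtual_channels(48, 48)    # high channel count: 2,304 channels
ratio = high / basic               # 12x
```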

Kobi Marenko

Just to add to what Ram said, basically we are providing 10x more performance, but at the same price as the next best radar. Next?

Karine Pinto-Flomenboim

Several investors had questions. What is the Tesla buzz all about?

Ram Machness

So we can only refer to OEMs that allowed us to say their names, but we are hearing a lot of requirements and a global trend of OEMs going into Imaging Radars and Perception Radars with high channel count, and that's across the market, across most of the OEMs today.

Karine Pinto-Flomenboim

Thank you. Question from Robert. Are other Chinese EV manufacturers testing or using your solutions?

Ram Machness

Yes. So the answer is absolutely yes. There are others that are right now working to integrate radars based on our chipset into their solutions, their vehicles.

Karine Pinto-Flomenboim

Last question. Kobi, what keeps you up at night?

Kobi Marenko

So, of course, except for the fact that we are in the final days -- sorry, final months -- before production, what really keeps me up at night is how we stay motivated for innovation: making sure that side by side with taking those chips to production, supporting our customers, and generating hundreds of millions of dollars from the current product, we are still innovating like an early-stage start-up with our next-generation product.

Kobi Marenko

Okay, we are grateful for your participation today and we appreciate your ongoing support as we strive to push the boundaries of innovation in the industry and become the leader in Level 2+ and Level 3 advanced perception.

To our valued employees and partners, we extend our sincerest thanks for your commitment to Arbe. It is your hard work and dedication that propel us forward to our goals. We are excited about the opportunities that lie ahead, and we are committed to keeping you updated on our progress.

Should you have any questions or want to discuss potential collaborations, please do not hesitate to reach us at investors@arbe.com or visit our website to schedule a meeting. We look forward to hearing from you. Thank you, all.

Karine Pinto-Flomenboim

Thank you.
