When we first launched the Perception Industry Market Map two years ago, we were still in the midst of a record-breaking run of private funding for exciting deep tech companies. It seemed like every day, a new LiDAR company was making headlines or a new open source perception project rose to the front page of Hacker News. But now, with venture markets tightening, layoffs rising, and mergers increasing, what has become of our market map and its 75 participants? As it turns out, the map has grown to 94 participants. It seems that our industry has never been healthier!
As we stated two years ago, our intention with this industry map is to paint an easily digestible picture of where the perception industry stands at this point in time. It shows the primary players in the key perception sensing modalities, as well as the software companies that provide the infrastructure and libraries that are most commonly used to build perception-powered products.
Tangram Vision also offers other great free resources for anyone building a perception-powered product, including:
So without further ado, let’s dive into these categories, and the nearly 100 companies that merit inclusion.
The ultrasonic sensing landscape is dominated by incumbents from the automotive world. The companies listed in the industry map are:
Notable additions to the 2023 map include:
Looking to learn more about ultrasonic sensors? Check out our blog post on how they work, where they’re applied in autonomy, and explore a few notable picks.
The depth sensing landscape includes large incumbent players, but has also seen a significant amount of startup activity over the past decade. The companies represented on the industry map are:
Want to go more in depth with depth sensors? Check out our depth sensor visualizer to compare different depth sensors’ ranges and fields of view, or read our depth sensor roundup.
In the industry map, we consider the two primary modalities of LiDAR: scanning and solid state. Scanning LiDAR units use a laser emitter and receiver spinning rapidly to capture a 360° view of what is around the sensor. Newer developments in scanning LiDAR also use MEMS mirrors or solid-state beamsteering to achieve similar results with less mechanical complexity. Solid state LiDAR units use a fixed laser emitter and receiver with a wide field of view to capture large amounts of data with no need for mechanical parts.
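For readers less familiar with how scanning LiDAR data is consumed downstream: a spinning unit typically reports each return as a range and an azimuth angle per laser channel, which perception software converts into Cartesian points. A minimal sketch of that conversion (function and parameter names are illustrative, not any vendor's API):

```python
import math

def sweep_to_points(ranges, azimuths, elevation=0.0):
    """Convert one spinning-LiDAR sweep into Cartesian points.

    ranges: measured distances in meters for one laser channel.
    azimuths: corresponding horizontal angles in radians.
    elevation: the channel's fixed vertical angle in radians.
    """
    points = []
    for r, az in zip(ranges, azimuths):
        x = r * math.cos(elevation) * math.cos(az)
        y = r * math.cos(elevation) * math.sin(az)
        z = r * math.sin(elevation)
        points.append((x, y, z))
    return points
```

Real drivers also apply per-channel calibration offsets and motion compensation, but the polar-to-Cartesian step above is the core of assembling a point cloud from a scanning unit.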
The rapid growth of the LiDAR market tracks the ever-growing interest in autonomous vehicles and mobile robots. However, slower-than-expected development of the AV and robot markets has rippled through the LiDAR industry, resulting in many startup failures and lots of M&A activity. The survivors of this first wave of failure and consolidation are becoming dominant incumbents for scanning LiDAR. A new wave of solid state LiDAR companies is emerging as well, many with a focus on ADAS applications in pursuit of large automotive contracts. In turn, many of the scanning LiDAR incumbents are launching their own solid state LiDAR units to compete.
Notable changes to the 2023 market map:
Looking to learn more about LiDARs, both scanning and solid state? Check out our LiDAR sensor roundup, which includes a comparison table showing ranges, fields of view, angular resolutions, and more.
Perception systems often rely heavily on visual data; however, visual data isn't always available (in foggy or unlit scenarios, for instance). Thermal cameras can provide sensory information when visual sources fail. For our 2023 market map, the number of significant thermal sensor vendors has increased by 200 percent!
Looking to learn more about thermal cameras and how they work? Check out our thermal camera roundup.
Like the scenario described above with thermal imaging, radar can provide sensory data when visual sources fail. Consider, for instance, an autonomous vehicle that relies on LiDAR and cameras in dense fog. Because the fog occludes any visual references, that AV won't be able to operate. With radar, however, it could. Radar has been deployed effectively in automotive settings since Daimler Benz first launched its Distronic adaptive cruise control system in 1999.
The most ubiquitous perception sensing modality is the camera. Cameras use the same imaging technology that is found in mobile phones, digital cameras, webcams, and most other camera-equipped devices. There's a huge diversity of available configurations, spanning resolution, dynamic range, color filtering, field of view, and more.
In perception applications, cameras can be used on their own, with each other, or with other sensing modalities to enable advanced capabilities. For instance, a single camera can be paired with a machine learning library to perform classification tasks for a bin picking robot. Two cameras can be used in a stereo configuration to provide depth sensing for a robot, or even an adaptive cruise control system on an AV. Cameras are commonly paired with IMUs to enable visual-inertial navigation for robotic platforms as well. In other words, perception systems almost always include cameras.
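The stereo configuration mentioned above recovers depth from the horizontal offset (disparity) of a feature between the left and right images, via the classic pinhole relation depth = focal length × baseline / disparity. A minimal sketch (the function name is ours, not a library API):

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Pinhole stereo depth: Z = f * B / d.

    disparity_px: pixel offset of a matched feature between the two images.
    focal_px: focal length expressed in pixels.
    baseline_m: distance between the two camera centers, in meters.
    """
    if disparity_px <= 0:
        # Zero disparity corresponds to a point at (effectively) infinity.
        return float("inf")
    return focal_px * baseline_m / disparity_px
```

Note the inverse relationship: depth resolution degrades quickly at long range, which is one reason stereo rigs on AVs use wide baselines while small robots can get away with compact modules.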
Depending on system design, cameras can be directly embedded into a device, or externally attached. The former is common for mobile robots and autonomous vehicles, while the latter is frequently found with machine vision applications for industrial robotics.
While there are hundreds of component and module vendors, we have chosen to include the most significant participants in the perception sensing world.
Another non-visual sensing modality, IMUs (aka inertial measurement units) sense kinetic motion. Taken alone, this data drifts quickly, yielding only a rough estimate of relative movement in space. In concert with a visual sensor like a camera or depth sensor, however, a visual-inertial sensor array can provide highly accurate location and movement data.
Like cameras, IMUs can be supplied as embeddable modules, or as external units.
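The intuition behind visual-inertial fusion can be shown with a toy one-dimensional complementary filter: the integrated gyro rate is smooth but drifts, while the visual estimate is noisy but drift-free, so blending the two yields an estimate with the strengths of both. This is a deliberately simplified sketch, not how a production VIO system (which would use an EKF or optimization-based estimator) is built:

```python
def complementary_filter(gyro_rates, visual_angles, dt, alpha=0.98):
    """Toy 1-D visual-inertial fusion of an angle estimate.

    gyro_rates: angular rates (rad/s) from the IMU, one per timestep.
    visual_angles: absolute angle estimates (rad) from a visual source.
    dt: timestep in seconds.
    alpha: trust placed in the (smooth, drifting) gyro integral; the
           remainder goes to the (noisy, drift-free) visual estimate.
    """
    angle = visual_angles[0]  # initialize from the absolute source
    estimates = []
    for rate, vis in zip(gyro_rates, visual_angles):
        # Propagate with the gyro, then nudge toward the visual fix.
        angle = alpha * (angle + rate * dt) + (1 - alpha) * vis
        estimates.append(angle)
    return estimates
```

Without the visual term (alpha = 1), the small bias in every gyro reading accumulates without bound, which is exactly the drift problem described above.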
For the past decade, the perception world has been abuzz with interest around neuromorphic cameras (also called "event cameras"). These cameras work on an entirely different principle than other cameras. Instead of capturing and sending an entire frame's worth of data, they send data on a pixel-by-pixel basis, and only for those pixels where a change has occurred. As a result, they can operate at much higher speeds and transmit far less data. However, they are still difficult to source at scale, and expensive. To date, there has only been one consumer application of event cameras, and it was not a success. Time will tell if these experimental cameras escape the lab and become a regular part of real world perception systems.
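To make the per-pixel principle concrete: an event camera emits a stream of (x, y, timestamp, polarity) tuples, one per brightness change. A common way to feed that stream to conventional vision algorithms is to accumulate a time window of events into a 2-D "event frame." A minimal sketch under that assumption (the `Event` type and function are illustrative, not a vendor SDK):

```python
from collections import namedtuple

# polarity: +1 for a brightness increase at that pixel, -1 for a decrease
Event = namedtuple("Event", ["x", "y", "t", "polarity"])

def accumulate_events(events, width, height, t_start, t_end):
    """Sum event polarities per pixel over a time window into a 2-D frame."""
    frame = [[0] * width for _ in range(height)]
    for ev in events:
        if t_start <= ev.t < t_end:
            frame[ev.y][ev.x] += ev.polarity
    return frame
```

A static scene produces almost no events (and hence a nearly empty frame), which is where the bandwidth savings over conventional cameras come from.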
To learn more about event cameras, how they work, and how different models compare, read our summary here.
A perception system is more than just sensors; it's also the software required to do something with that sensor data. Sensor software can operate at lower levels in the stack – for instance, Tangram Vision's software ensures that all sensors are operating optimally and transmitting reliable data. It can also operate at the application level – an example would be using Slamcore's SLAM SDK to power a navigation system for a warehouse robot.
Within the perception software ecosystem, there has been heated competition among SLAM providers over the last decade. This has been encouraged by increased investment in both robotics and XR technology like virtual reality and augmented reality. Consolidation has occurred, with many teams being acquired by the FAANG companies as they plan for future consumer products like AVs and VR headsets.
Notable changes to the 2023 market map include:
Even in an industry rife with consolidations and shutdowns, it appears that growth is still the dominant trend. Yet, like many in the industry, we didn't see this growth clearly. We'd been paying attention to the bad news and failed to properly recognize the good.
So here’s to the 94 companies on this year’s map that are contributing to that growth story. As we said two years ago, the perception industry is only at the beginning of its growth curve.
Like any nascent industry, there will be setbacks and unexpected events. But, for now, we feel confident in predicting that the next time we update this map, we’ll exceed 100 participants.
If you enjoyed reading this post and viewing the market map, please share it with your colleagues and on social media. If you've got suggestions for companies to consider for inclusion, please tweet at us and we'll add them to our tracker!
Thanks, and stay tuned for future updates.
<a href="https://www.tangramvision.com/blog/the-2023-perception-industry-market-map" title="2023 Perception Sensor Industry Map"><img src="https://uploads-ssl.webflow.com/5fff85e7f613e35edb5806ed/64243ba21d52fb4401f45056_Tangram-Vision-Perception-Sensor-Market-Map-2023-Final.png" width="100%" style="max-width: 850px;" alt="2023 Perception Sensor Industry Map"></a><br>Provided by <a href="https://www.tangramvision.com" target="_blank">Tangram Vision</a>