While Cruise’s robotaxis have been taking all the recent self-driving headlines (what with running away from the police, or collectively getting bricked and snarling San Francisco traffic), Waymo has quietly rolled out their own driverless robotaxis in San Francisco as well.
And, at this year’s TechCrunch Sessions: Mobility conference, we had the benefit of staring at one of Waymo’s autonomous Jaguar I-Paces for two entire days, as it was…five paces from the end of the Tangram Vision booth. Given the ability to inspect a stationary Waymo I-Pace in detail, we did just that, which means it’s time for another Tangram Vision Sensor Breakdown - Waymo edition!
In this breakdown, we’ll go front to back, and bottom to top. We think you may be surprised by a few things we found, including hidden sensors, what we believe to be a dual use approach to the Waymo I-Pace’s sensor arrays, and a potential upgrade to a popular Google service that could be enabled by the Waymo taxis.
So let’s begin at the beginning and at the bottom, at the Jag’s front bumper.
The first sensor we see is a spinning LiDAR, an evolution of Waymo’s Laser Bear Honeycomb. This design is notable in that it was created specifically to slough off contaminants and occluding materials while in operation, which is important for robust performance in any on-road environment, whether an interstate highway or a San Francisco street. The LiDAR is oriented fairly near to vertical, with a slight downward tilt. As the original Laser Bear Honeycomb had a 95° vertical field of view (VFOV), we can assume this updated unit has the same or better. In this position, the LiDAR can collect street-to-vehicle-height data in a 180° arc, starting at 0m and stretching out to as much as 300m. This is tremendous for a self-driving vehicle, and allows the LiDAR units to operate as part of a robust safety system to protect pedestrians, cyclists, and other drivers. However, given this LiDAR’s low placement, the other LiDAR sensors present on the Jag, and the low speeds to which these robotaxis are limited, we believe this unit is likely tuned to a much shorter range and used primarily for near-field obstacle avoidance.
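As a rough sanity check on that geometry, here's a minimal flat-ground sketch of where a downward-tilted LiDAR's vertical field of view intersects the road. The mount height and tilt angle are our assumptions for illustration; Waymo hasn't published these figures.

```python
import math

def ground_coverage(mount_height_m, tilt_down_deg, vfov_deg):
    """Return (near, far) ground distances swept by a downward-tilted
    LiDAR, assuming flat ground. Angles in degrees; positive tilt is
    below the horizon."""
    lower_edge = tilt_down_deg + vfov_deg / 2.0  # steepest-down beam
    upper_edge = tilt_down_deg - vfov_deg / 2.0  # shallowest beam

    def ground_hit(angle_deg):
        if angle_deg <= 0:
            # at or above the horizon: limited only by sensor range
            return math.inf
        return mount_height_m / math.tan(math.radians(angle_deg))

    return ground_hit(lower_edge), ground_hit(upper_edge)

# hypothetical numbers: 0.5 m mount height, 10° downward tilt, 95° VFOV
near_m, far_m = ground_coverage(0.5, 10.0, 95.0)
```

With these assumed values, ground coverage starts well under half a meter from the sensor, while the upper edge of the 95° VFOV clears the horizon — consistent with a low-mounted unit that can see from 0m out to whatever its maximum return range happens to be.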
Now, the LiDAR unit is not the only sensor we see in the picture above. We also see a camera of some sort, surrounded by four LEDs. Our best guess is that this is not a pure RGB camera, but rather an RGB-near-infrared (RGB-IR) camera, and that those LEDs are infrared floods to aid in capturing additional detail. We can see that this camera and the LEDs are angled downwards by around 20° so that they capture the road surface immediately in front of the Waymo Jag. But why? We suspect this camera is fused with the LiDAR to supplement the three-dimensional LiDAR data with higher-resolution imagery (where possible) for an added layer of safety. Why RGB-IR? Because RGB cameras (even HDR cameras) can lack detail in dark operating conditions, which RGB-IR can capture. Therefore, we get visual data in nearly any lighting condition.
Still on the bumper, but off to the side, we see two of the four ultrasonic sensors that are fitted to the I-Pace. These are part of the Jaguar’s standard parking aid system (in the past, made by Valeo), and are likely accessible via the CAN-BUS for Waymo’s engineers to integrate into their sensing system design. They provide relatively coarse near-field proximity data that can be used as a fallback in low- to no-light conditions, in conjunction with the LiDAR sensors and cameras, for object detection and obstacle avoidance.
Taken together, the six sensors noted above all provide a high level of data and redundancy for the purpose of avoiding collisions with other cars, pedestrians, cyclists and other roadway users at low speeds. With the exception of the LiDAR, though, they don’t provide the same protection at higher rates of speed. For that, we need to move from the front bumper to the front fenders.
As we reach the fenders on the sides of the car, we see a similar LiDAR unit to what we saw on the front bumper, but now mounted on a separate pod on the Jaguar’s fender. But it’s not alone. There are three other sensors as well!
Boy, is there a lot going on with each of these pods! Let’s break it down from front to back. At the very front, we see a cavity pointed towards the Jag’s typical direction of travel - forward. Within this cavity, we have another camera. In this case, it is a wide vision HDR RGB camera that works in conjunction with the other forward-facing sensing elements to provide high-resolution color data that can be used at higher rates of speed. It’s also mounted just above the level of the headlights, providing excellent coverage for objects from the size of a small dog up to a large truck.
Right next to the forward-facing camera, and pointed outwards from the side, we find another wide vision HDR RGB camera. Our belief is that this side camera joins with the forward-facing camera to form a stitched, 270° high-resolution view of the Jag’s surroundings going forward. Why 270°? The I-Pace has a 20.25’ turning radius, so these cameras can work together to provide high-resolution imagery of any object or obstacle within that radius. They are also likely fused with LiDAR and radar point clouds for even more data.
Speaking of radar, we see our first radar unit just behind the cameras. This radar is pointed to the side, providing a three dimensional view of mid- to long-field objects and obstacles. Given that Waymo claims full 360° radar coverage around the I-Pace robotaxi, we assume it has a near 180° FOV so that its output can be stitched with the forward and rear facing radars also on the Jag (more on that later).
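That 360° stitching claim is easy to sanity-check in code. Here's a small sketch that verifies a set of radars jointly covers every bearing around the vehicle; the azimuths and fields of view below are hypothetical, since Waymo publishes neither.

```python
def covers_360(sensors, step_deg=1):
    """Check that sensors, given as (boresight_azimuth_deg, hfov_deg)
    pairs, jointly cover every bearing around the vehicle."""
    def covered(bearing):
        for azimuth, hfov in sensors:
            # smallest angular distance from this bearing to the boresight
            diff = abs((bearing - azimuth + 180.0) % 360.0 - 180.0)
            if diff <= hfov / 2.0:
                return True
        return False
    return all(covered(b) for b in range(0, 360, step_deg))

# hypothetical layout: two side radars plus forward and rear units,
# each with a near-180° horizontal field of view
radars = [(90, 178), (270, 178), (0, 178), (180, 178)]
```

Run `covers_360(radars)` and you'll see that near-180° units at roughly 90° spacing leave no gaps — which is presumably why wide-FOV radars make the stitched-perimeter approach practical.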
Behind the radar, we return to the camera/LiDAR array that is very similar to what we saw at the front of the Jag. We believe this LiDAR is used for near-field perimeter obstacle detection. In particular, it is likely used, along with the camera, to sense curbs, adjacent vehicles, and other roadway occupants like cyclists, pedestrians, and animals.
Going further down the side of the car, we found something that most people may have missed until now. Hidden in the bottom of the Jaguar’s mirror, we see a small RGB camera, one on each side. Now, to be clear, these cameras are an optional accessory for the Jaguar I-Pace, and they are again used for park assist purposes (in this case, to give the driver a full 360° view of the car’s surroundings). But the data (not the video stream) from these cameras may be available via CAN-BUS to Waymo’s engineers, who could use this camera's signals for lanekeeping or curb sensing purposes as well. And one must wonder - why bother to pay for an optional camera if you don’t plan to use it?
Before we reach the roof of the I-Pace, let’s finish cataloging the sensing we find below the window line of the Jag. As we near the rear bumper, we find a sensor pod that appears very similar to the ones that are mounted at the front, but with a notable exception. These pods only contain cameras and radar, and no LiDAR unit. Not only that, but they are angled to capture data at the car’s rear corner, and not its side. This informs us that the cameras and radars in the similar pods up front likely have wide fields of view to capture the necessary data to create stitched 360° camera and radar views around the perimeter of the vehicle.
Moving to the rear of the car, we find a final combination LiDAR/camera/LED flood unit, similar to what we saw up front, as well as more ultrasonic sensors. Again, we believe that these units are largely tuned to near-field use cases of object detection and obstacle avoidance, with capabilities to capture both 3D and visual data in a wide array of lighting conditions.
Up to this point, why have we only reviewed the sensors on the body, and not those on the roof? Well, that’s because we think the sensors on the body serve a distinct purpose, completely separate from those on the roof. Our belief is that the body-mounted sensors are first and foremost for level five autonomy functions: obstacle avoidance, object detection, and navigation. We believe the majority of the roof-mounted sensor array is used for something entirely different: building the high-resolution 3D maps that these Waymo robotaxis use for navigation. And possibly then some. We believe these roof-mounted sensor arrays may also be used to build Google’s next generation of mapping and street view products, in dense 3D. Let’s get into them!
At the very front of the array, we have the sensors that likely aren’t just for mapping, but rather for L5 autonomy needs like navigation, object detection, and obstacle avoidance at mid- to far-range. We have two radar units that complete the 360° perimeter of radar view around the I-Pace. Inboard from the radar units, we have a pair of wide vision HDR cameras behind protective lenses. And, if you zoom in close enough, you can see that the bump just to the side of the lens hides a tiny cleaning device to keep this lens clear. Whether it is a tiny squeegee, an air jet, or a washer jet, we can’t quite tell. But it’s a neat little feature! These cameras are supplemented by a central camera unit. Along with the pod-mounted cameras, these complete the 360° perimeter of HDR camera views around the I-Pace.
As a final note, we see the Jaguar’s factory-placed lane detection camera embedded in the windshield just below the Waymo sensor array. Like every other Jaguar-supplied camera, this camera's data, but not the video stream, should be available via CAN-BUS for Waymo’s engineers to integrate into their total sensing package. They may simply use it as Jaguar does in the standard I-Pace - for lanekeeping.
And with that, we’ve completed the sensor array for navigation, object detection, and obstacle avoidance. To summarize, on the car and at the front of the roof, we have:
Of course, there are a few others that we’re not counting (encoders or hall sensors at the wheels; steering angle sensors; throttle and brake angle sensors; IMUs; GPS; etc.), but those aren’t visible, and we’d simply be making assumptions as to what does or doesn’t exist.
Otherwise, that’s a very thorough level 5 autonomy package, with 32 sensors total, providing 360° of coverage from three distinct modalities! This sensing package is among the most comprehensive we’ve evaluated for autonomy, and shows Waymo’s desire to optimize for safe operation, with redundant systems, layered coverage, and environmental robustness.
Now, one quick sensor note before we move onto the mapping package. There are a couple more little sensors that get an honorable mention. They’re not for any of the autonomy or mapping purposes we’ve noted; rather, they are for sensing an approaching rider.
Over each rear passenger door, we see this ultrasonic sensor that can unlock the door when the correct rider approaches:
And over the rear hatch, we have a similar pair of ultrasonic sensors that can open the hatch for luggage when the correct rider approaches:
Neat, right? It’s a simple, elegant solution to ensuring that the vehicle remains secure until the right occupant is in proximity.
Okay. NOW, 38 sensors later, we can get to the mapping package. So what have we got?
What we have is a long-range, multi-channel spinning LiDAR fused with a stitched 360° camera array. Based on the size of the LiDAR’s protective dome, as well as the sophistication of Waymo’s other LiDAR units, our assumption is that this LiDAR may have as many as 128 channels, for an ultra-high-resolution three-dimensional representation of the environment that can stretch out to hundreds of meters.
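For a sense of what 128 channels buys, here's a quick back-of-the-envelope calculation. Both the vertical field of view and the representative range are our assumptions, not published specs.

```python
import math

# assumed specs for illustration; Waymo has not published these numbers
channels = 128
vfov_deg = 40.0   # hypothetical vertical field of view
rng_m = 200.0     # a representative long-range return

# vertical angular spacing between adjacent channels
vert_res_deg = vfov_deg / (channels - 1)

# vertical gap between adjacent scan lines at that range
spacing_m = 2 * rng_m * math.tan(math.radians(vert_res_deg) / 2)
```

Under these assumptions, adjacent channels sit about a third of a degree apart, which works out to scan lines roughly a meter apart even at 200m — dense enough to resolve building facades, signage, and street furniture for mapping.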
The visual spectrum part of this system is achieved through six pairs of cameras (for a total of twelve cameras). In each of these camera pairs, one of the cameras appears to be high dynamic range color, while the other looks to be infrared. Both are placed behind glass with bandpass filters, and a mechanical washer and wiper system keeps them clean from debris. We got a nice closeup shot of this system, which you can see below:
Together, this ultra-high-resolution LiDAR and the twelve-camera array can be fused to create a dense 360° 3D map of anything the Waymo robotaxi passes. And with a fleet of dozens of taxis covering thousands of miles a day in San Francisco, that map can be refined with many, many inputs to extreme accuracy and detail — in our estimation, accurate and detailed enough to be used for a supercharged 3D version of Google Maps and street view! One intriguing possibility with the HDR and IR cameras? 3D Google Maps and street view…with nighttime mode! Mark our words - you saw it here first! Even if our speculative product roadmap predictions turn out to be incorrect, we still stand by our analysis that these sensors are strictly for mapping, and not for autonomy. Waymo will benefit from having their visual maps constantly updated with additional views to have the most robust maps from which their cabs can operate.
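The fusion step itself is conceptually simple: each LiDAR return is transformed into a camera's frame and projected through a pinhole model to find which pixel colors it. A minimal sketch follows, with made-up intrinsics and a translation-only extrinsic for brevity; a real pipeline would also apply a rotation and a lens distortion model.

```python
def project_point(p_cam_axes, t_lidar_to_cam, fx, fy, cx, cy):
    """Project a LiDAR point (already expressed in camera axis
    convention: x right, y down, z forward) into pixel coordinates.
    t_lidar_to_cam is the calibrated translation; rotation and lens
    distortion are omitted to keep the sketch short."""
    x = p_cam_axes[0] + t_lidar_to_cam[0]
    y = p_cam_axes[1] + t_lidar_to_cam[1]
    z = p_cam_axes[2] + t_lidar_to_cam[2]
    if z <= 0:
        return None  # point is behind the camera; no pixel to color
    return (fx * x / z + cx, fy * y / z + cy)

# hypothetical intrinsics for a 1920x1080 camera
pixel = project_point((1.0, 0.5, 10.0), (0.0, -0.1, -0.2),
                      fx=1000.0, fy=1000.0, cx=960.0, cy=540.0)
```

Run this over every return in a sweep, for every camera, and you get a colored point cloud — the raw material for the kind of dense 3D map we're describing. The catch is that the projection is only as good as the extrinsics, which is where calibration comes in (more on that below).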
And with that, we’ve completed the sensor array for mapping. To summarize, we have:
When we add these thirteen sensors to the others we’ve already noted, we arrive at a grand total of (drumroll please)…51 sensors. 51! From a cost perspective, we’d have to guess that this sensor array costs Waymo upwards of $40-50K per vehicle, not including the other sensors we could not assess, or the additional onboard compute, wiring harnesses, etc. required to install and run it. And we should note that the LiDAR units are manufactured by Waymo, so costs would likely be even greater if sourced from a third-party vendor.
Given the type of company we are here at Tangram Vision, we also need to address an elephant in the room: calibration. How do you achieve initial calibration for 51 sensors, and more importantly, how do you maintain calibration for 51 sensors in a challenging environment like an urban street? The answer is you don’t.
Well, that’s not exactly true. Here’s what we think is going on. To begin with, we don’t believe that all 51 sensors are calibrated as a single array. Rather, we think that there are separate calibration processes based on the two different roles we’ve noted (autonomy or mapping), and that some sensors remain loosely coupled given their secondary or tertiary roles in Waymo’s sensor array.
During production, the six radars, four LiDARs, and nine of the cameras used for autonomy are calibrated on a 360° turntable for a tightly coupled sensor array, typical of an on-highway autonomous vehicle with L5 aspirations. The Jaguar-supplied ultrasonic sensors and cameras are likely used as secondary and tertiary data sources in a loosely coupled fashion. The ultrasonic sensors for occupant awareness are not calibrated to this system at all.
These body-mounted sensors present challenges for maintaining calibration during operation, given that body panels can flex. In the case of the rear mounted LiDAR/camera array, it’s even trickier as it is attached to a body panel (the rear tailgate) that moves regularly! Therefore, Waymo has likely developed an online calibration scheme to keep these units calibrated during operation, with an occasional turntable recalibration in a dedicated facility to ensure accuracy and precision over time.
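We can only guess at how that online scheme works, but the general shape is familiar: accumulate small per-frame misalignment residuals (for example, from LiDAR-to-camera edge alignment), smooth them into a drift estimate, and flag the vehicle for turntable recalibration once drift exceeds a threshold. A toy single-axis sketch, with made-up smoothing and threshold values:

```python
class ExtrinsicYawTracker:
    """Toy online-calibration monitor: exponentially smooth noisy
    per-frame yaw residuals into a drift estimate, and report when
    that drift exceeds a recalibration threshold."""

    def __init__(self, alpha=0.05, limit_deg=0.5):
        self.alpha = alpha          # smoothing factor per frame
        self.limit_deg = limit_deg  # drift that triggers recalibration
        self.drift_deg = 0.0        # current smoothed drift estimate

    def update(self, residual_deg):
        # exponential moving average of observed residuals
        self.drift_deg += self.alpha * (residual_deg - self.drift_deg)
        return abs(self.drift_deg) > self.limit_deg
```

A tracker like this tolerates frame-to-frame noise while still catching a panel that has genuinely shifted — the sort of behavior you'd want before trusting a LiDAR mounted on a tailgate that opens and closes all day.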
During production, the LiDAR and twelve cameras are calibrated on a 360° turntable for a tightly coupled sensor array that produces dense, accurate 3D maps. Given the highly rigid structure to which these sensors are attached, it is unlikely that calibration will drift noticeably during operation. With that said, Waymo likely still applies an online calibration scheme here as well to maintain accuracy. In addition, this array may be able to access data captured by the mapping arrays of other I-Paces, compare its own captures, and adjust appropriately based on deltas from expected values.
Of course, Waymo wasn’t the only company that brought sensor-equipped vehicles to TechCrunch Sessions: Mobility. While Tangram Vision’s classic Mini Cooper sensor test bed lacks some of the sophistication of Waymo’s robotaxi, we believe we win hands-down on style (honorable mention: Faction’s ElectraMeccanica test vehicle). And, like Waymo, we have a few surprises of our own. Stay tuned for Tangram Vision’s next sensor testing vehicle — it will be even more interesting than the Mini — by subscribing to our newsletter, following us on Twitter, or following us on LinkedIn. See you down the road…autonomously!