Imagine driving a car without windshield wipers. After a few days, it may become difficult or even impossible to see well enough to drive safely.
Now imagine deploying a sensor-enabled platform like a robot or a drone with no method in place to clean the sensors that allow it to see and navigate. See where we're going?
Sensor cleaning is a thing. But it is a thing that is often addressed relatively late in the development cycle for a sensor-enabled platform. After all, lab environments don't present many of the environmental conditions that can flummox a sensor in the real world. So let's talk about what kinds of contaminants and debris can degrade sensor performance, and how to plan in advance and during deployment to keep sensors working reliably and consistently.
The number one contaminant we'll consider is dust. Yes, your sensors may be rated IP67 against dust, but that rating only applies to dust intrusion into the sensor body. Dust can still settle over sensor projectors and camera lenses, creating occlusions, refractions, and laser wavelength shifts.
Sensor-enabled devices are often deployed in very dusty environments. Consider construction sites, farms, and warehouses: in each of these environments, the work itself generates significant quantities of dust.
Dust comes in many forms, depending on the environment. When it comes to sensors, two properties concern us most: abrasiveness and electrostatic stickiness.
Many sensors rely on lasers tuned to a specific wavelength. For instance, a structured light 3D sensor uses a laser projector in the 800-900nm range, paired with a wavelength-matched camera that reads the distortion of the projected pattern to build a 3D model. The exterior optical elements in a system like this carry a special chemical coating that acts as a bandpass filter, admitting only the specific wavelength of laser light used by the projector and the camera. If such a sensor is used in an environment with highly abrasive dust (for example, construction sites that generate lots of silica-rich concrete dust), these coatings can abrade over time, eventually causing the 3D sensor to fail.
Consider a high dynamic range (HDR) camera mounted on a strawberry harvesting robot. Because the strawberries are grown outside, HDR imaging ensures that the system can gauge ripeness by color under a variety of lighting conditions. Yet if that HDR camera becomes sufficiently occluded by electrostatically charged dust particles from the surrounding soil, it may no longer provide accurate chroma readings, and the system will fail.
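In practice, a perception stack can often catch this kind of gradual occlusion in software before it causes a hard failure. The sketch below is a minimal, hypothetical example (Python with NumPy; the function names and the 0.5 baseline ratio are illustrative, not part of any particular platform): it flags a frame whose high-frequency detail has collapsed relative to a known-clean baseline, a common symptom of a dust-coated lens.

```python
import numpy as np

def sharpness(gray: np.ndarray) -> float:
    """Variance of a 3x3 Laplacian response over a grayscale frame.

    High-frequency detail (edges, texture) drives this value up;
    a film of dust blurs the image and drives it down.
    """
    lap = (
        gray[:-2, 1:-1] + gray[2:, 1:-1]
        + gray[1:-1, :-2] + gray[1:-1, 2:]
        - 4.0 * gray[1:-1, 1:-1]
    )
    return float(lap.var())

def needs_cleaning(frame: np.ndarray, clean_baseline: float,
                   ratio: float = 0.5) -> bool:
    """Flag the sensor for cleaning once sharpness falls below a tuned
    fraction of the baseline measured with a known-clean lens."""
    return sharpness(frame) < ratio * clean_baseline
```

A real deployment would average this metric over many frames and tune `ratio` per sensor and scene; a single noisy or low-texture frame should never trigger a cleaning cycle on its own.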
Organic matter is a slightly nicer way to describe splattered bugs, tree sap, and bird poop, among other contaminants. These are common challenges for any autonomous platform that works outdoors, including autonomous vehicles (AVs), drones, and service robots.
A bullseye hit from a friendly finch can stop a sensor from operating until the offending mess is cleaned. Similarly, an AV traveling at highway speeds may collect enough bugs that its sensors' performance degrades to unacceptable levels.
Many of these organic contaminants are particularly tacky, which means they require more aggressive cleaning using chemicals and/or mechanical methods to remove.
Your fingers may look clean, but a fingerprint left on a critical sensor can degrade performance, even if it rarely stops operation altogether. Fingerprints, like other organic matter, contain components that are very tacky and difficult to remove without cleaning agents.
For any sensor-enabled platform that works outdoors, water, and water mixed with other contaminants (salt spread on roads, for instance), can prevent data capture. Even after the liquid dries, any particulate matter that was suspended in it can remain behind as a residue that continues to block the sensor.
Because of the proliferation of AVs and advanced driver assistance systems (ADAS), there are a number of automotive OEMs and startups that are working on sensor cleaning systems that address liquids and the contaminants that they contain.
In some scenarios, contamination build-up will be slow enough that periodic manual cleaning will be an appropriate approach to ensuring optimal performance. In others, however, automated or designed-in cleaning approaches will need to be considered.
As noted above, the most troubling dust for sensors is abrasive dust. Fortunately, abrasive dusts often do not present strong electrostatic forces, and can often be cleared with strong jets of compressed air. This can be done manually with a camera lens air blaster of the type typically used for high-quality photography lenses.
Another option is a soft natural-bristle brush, like a camel hair brush. These gentle brushes use their own electrostatic charge to pick up and carry away fine dust particles from lenses with minimal risk of damage; that same charge also makes them an excellent option for removing sticky, electrostatically charged dust from sensors. Make sure to clean these brushes occasionally to avoid dust build-up.
Exercise caution when using cleaning solutions and cloths. Many household cleaners include agents like alcohols, ammonias and solvents that risk stripping away some or all of the specialized coatings that are often applied to sensors. Cloths that are applied directly to precision sensor elements can leave fine scratches that can impede proper operation. Finally, some cleaning solutions can leave unnoticed residues. If you are unsure what can or can't be used to clean your platform's sensors, contact your sensor manufacturers for their recommendations.
For scenarios where manual cleaning is not an option, consider onboard compressed air. It can be supplied continuously to keep dust from landing on sensors, or at a periodic interval to blow off accumulated dust. Keep in mind that compressing air concentrates any moisture it contains, which can condense and spray out as fine water droplets along with the air. For this reason, a water trap or separator is recommended; it will require periodic servicing.
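The timing logic for a periodic purge is simple to sketch. In the hypothetical example below (Python; `open_valve` and `close_valve` stand in for whatever I/O layer actually drives your platform's solenoid valve, so treat them as assumptions rather than a real API), a short burst of air is fired on a fixed interval:

```python
import time
from typing import Callable, Optional

def run_purge_cycle(
    open_valve: Callable[[], None],
    close_valve: Callable[[], None],
    burst_s: float = 0.5,
    interval_s: float = 600.0,
    cycles: Optional[int] = None,
) -> int:
    """Fire short bursts of compressed air on a fixed interval.

    open_valve/close_valve are hypothetical callbacks into the
    platform's I/O layer (e.g., a GPIO-driven solenoid valve).
    Runs indefinitely when cycles is None; otherwise returns the
    number of bursts fired.
    """
    fired = 0
    while cycles is None or fired < cycles:
        open_valve()          # start the air burst
        time.sleep(burst_s)   # hold the valve open briefly
        close_valve()         # end the burst
        fired += 1
        if cycles is None or fired < cycles:
            time.sleep(interval_s)  # wait for the next scheduled purge
    return fired
```

A production controller would typically hang off the platform's existing task scheduler rather than a blocking loop, and would service the water trap mentioned above on its own schedule.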
For some sensors, a liquid cleaner can work well. Innovators like SEEVA are creating integrated systems with pumps, nozzles and specialized, sensor-safe solutions to keep ADAS sensors free of debris and operational. Large OEMs like Valeo are creating sensor washing systems, as well. As noted above, make sure to check with your sensors' manufacturers before applying any sort of cleaning solution to ensure compatibility.
In many cases, designed-in solutions can prevent the worst organic matter scenarios that cause sensor failures. Built-in hoods can keep bird droppings and tree sap from accumulating on sensors. And if you have deep pockets like Google's Waymo division, you can create a more complex mechanical solution, such as a wiper-equipped dome that encloses the sensor.
In a solution like this, note that the dome covering the optical elements likely does not carry any specialized coatings and is likely replaceable. A cleaning solution that includes an agent like methanol can therefore be used to rapidly break down organic matter, and any surface scratches from the wiper action can either be polished away periodically or addressed by fitting a new dome.
Preventing bugs from blocking forward-facing sensors on autonomous vehicles can be managed in an elegant, passive fashion with air screens.
An air screen is simply a matter of using aerodynamic aids to redirect airflow away from sensitive sensor surfaces, keeping them clearer for longer. Manual cleaning will still be needed periodically to remove the bugs that do reach the sensor, and because bug residue is tacky, that means mechanical cleaning and/or a specialized solution to break down the organic matter. Check with your sensors' manufacturers for their recommended approach, or place the sensors behind clear shields that can be cleaned with standard solutions.
Fingerprints may seem like the most innocuous of the contaminants mentioned above, but they can be among the riskiest to remove. Fingerprints are a combination of fluids, including organic oils, that require a cleaning solution to remove completely. Again, many cleaning solutions are incompatible with the specialized coatings often present on sensor optics. Likewise, even gentle cloths such as microfiber may still be too abrasive for delicate lenses and coatings, leaving fine scratches that produce refractive artifacts and degrade sensor data. The best option is therefore to check with your sensors' manufacturers for their recommended approach to removing fingerprints.
Thankfully, many liquids evaporate readily and will eventually disappear, leaving minimal residue. For those that don't evaporate quickly, or that leave enough residue to impact sensor operation, other approaches can be taken.
Sensor manufacturers are now experimenting with built-in hybrid cleaning approaches that use a combination of a specialized cleaning fluid and an ultrasonic transducer to shake loose liquids and suspended contaminants that are otherwise difficult to remove from sensors. Others are innovating with hydrophobic coatings that can slough off liquids rapidly and won't impact optical sensing wavelengths.
When cleaning liquid or liquid residue off sensors you have already integrated, the best option is once again to check with your sensors' manufacturers for their recommended method, to avoid using something incompatible.
Sensor cleanliness is an important topic that is only now starting to garner broader attention from roboticists and perception engineers. This is partially because of the proliferation of sensor-enabled platforms like AVs and service robots that operate consistently in outdoor environments, but also because of the ever-expanding variety of sensors and sensor modalities being deployed in new environments as part of new product development cycles.
There are a lot of existing cleaning approaches that can be repurposed from fields like photography where precision lenses and specialized coatings are common. Ultimately, all of us in the robotics and sensor industries should continue to innovate and share best practices for sensor cleaning to ensure that the platforms we create can successfully operate under all real world conditions.
While sensor cleaning isn't something the Tangram Vision SDK addresses yet, there are many other sensor challenges that it does. If you're currently working on a sensor-enabled platform, we'd encourage you to download it and test it for free. And if you have any comments or suggestions for this blog post or any others, please tweet at us to let us know!