Be among the first to streamline and optimize sensors with the Tangram Vision SDK
This week, we're introducing a new series where we explore some of the more fascinating vision-enabled devices being deployed into the world today. We'll look at the overall industrial design, but we'll focus specifically on our area of expertise: perception and sensing. We'll try to deduce what sensors the device is using, the advantages and challenges of that sensor array, and any areas where enhancements are possible thanks to the constant evolution of perception techniques, sensors, and software.
For our very first post, we placed a very sharp dart in a gripper attached to a Kuka arm. We then ran far, far away as the Kuka flung the dart haphazardly into a wall covered with pictures of potential subjects. It turns out that the first subject chosen by our Kuka is the FarmWise weeding robot. (We apologize to the FarmWise team, as the dart absolutely obliterated the photo of their robot. It then continued through the drywall and exited the building, after which we heard what sounded like a car veering off the road. We may need to reprogram the Kuka).
FarmWise builds robots that work on...farms, obviously. FarmWise's Titan FT35 robot is primarily used for weeding. It can mechanically destroy weeds using blades, and it can also chemically eliminate pests by applying fungicides and insecticides.
The FT35 is one of the larger mobile robots we've seen, about the size of a medium-duty commercial truck. It is an articulated unit, comprising a self-driving robotic tractor to which a separate weeding unit is attached. The tractor appears to have an onboard gas- or diesel-powered generator for power. This means that farms that already have gas or diesel fueling infrastructure don't need to add electric charging infrastructure to operate the FT35. It also means that the FT35 can operate for long periods of time over large tracts of land without getting stuck in a far corner with a discharged battery (that said, we see that there are tow hooks at the front and rear of the tractor...smart move!).
The FT35 works around produce like leafy greens. These tend to be relatively short, and easy to plant in neat, tightly constrained rows. They also have distinctive leaves that are easily differentiated from the leaves or blades of common agricultural weeds, making them well suited to machine learning classification that ensures only weeds are removed, not valuable crops.
We'll split the FT35 into the tractor and weeder and then determine what kinds of sensors may be employed and for what purposes.
The Tractor looks deceptively simple from a sensing perspective, but we assure you - there's more than meets the eye here. Let's start with the most obvious first — what we can see. After all, that's also what allows the tractor to see.
At the very front of the tractor, a hood shields a single, large camera. This is angled down to capture imagery of what is immediately in front of the tractor. Agricultural implements are used at all times of day - sometimes before dawn, sometimes after dusk (hence the title of Neil Young's excellent album "Harvest Moon") - so this tractor needs to operate in all sorts of ambient lighting conditions. We therefore believe that the large camera up front is a high dynamic range (HDR) CMOS camera that can work in both low-light and direct-sunlight conditions. We think its data informs an object detection system that primarily looks for two conditions: "crops" or "no crops". Where it senses "no crops", it steers the robot so that the tires travel in the "no crops" zone to minimize damage to the crops. This camera likely also provides obstacle avoidance, so that the tractor will stop if an unforeseen obstacle enters its path. We do not believe that this camera is used for any kind of mapping, given the highly structured nature of modern agriculture.
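As a toy illustration of what such a "crops / no crops" detector might look like, here's a minimal sketch that labels image columns as drivable bare soil using an excess-green index. The index, threshold, and camera details are entirely our assumptions, not FarmWise's actual pipeline:

```python
import numpy as np

def no_crop_columns(rgb, green_thresh=0.05):
    """Label each image column 'crop' or 'no crop' using a simple
    excess-green index (2G - R - B). Columns whose mean index falls
    below the threshold are treated as bare soil ('no crop')."""
    img = rgb.astype(np.float64) / 255.0
    exg = 2 * img[..., 1] - img[..., 0] - img[..., 2]  # excess green
    col_score = exg.mean(axis=0)                        # one score per column
    return col_score < green_thresh                     # True = drivable soil

# Synthetic frame: left half brown soil, right half green crop
frame = np.zeros((120, 160, 3), dtype=np.uint8)
frame[:, :80] = (120, 85, 60)    # soil-ish RGB
frame[:, 80:] = (40, 160, 50)    # crop-ish RGB
mask = no_crop_columns(frame)
```

A real system would use a trained classifier rather than a hand-tuned index, but the steering logic downstream is the same: aim the tires at the columns flagged `True`.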
Just to the left of this camera, we see a white cylinder emerging from the roof. This looks to be a GPS antenna. This receives signals from GPS satellites, from which the tractor computes precise location coordinates so that it knows where it is in the field. The tractor will use this data to determine where a crop row starts and where it ends, as well as to understand how far it is from critical infrastructure, such as a refueling station, or a dumping bin where it will dispatch the weeds it has collected.
One note on the above image: we see a black box to the left of the imager. This appears to be simply a floodlight, used in low-light conditions to aid the HDR camera in capturing visual data.
Beyond this frontal camera and the GPS receiver, there really aren't many additional sensors of note on the exterior of the tractor. There's just one more sensor-like apparatus that we see on the exterior - it's that little black dome you see peeking above the main front camera. Is it a LiDAR unit? We're not sure that the FarmWise tractor needs this. Perhaps it's simply concealing a cellular antenna for transmitting data? Could be...
Now let's talk about the sensors we can't see. These will be an inertial sensor and two types of encoders. We will make an educated guess that the rear wheels include wheel encoders to provide odometry, which can be fused with the GPS signal to correct the tractor's field position based on distance traveled. We will make a second educated guess that the steering gear at the front of the tractor includes a third encoder to measure steering angle, again helping refine field position based on the steering inputs the tractor receives. Lastly, we'd guess that an inertial measurement unit (an IMU in the parlance of our industry) is integrated into the tractor design to measure movement in six degrees of freedom. This would be used to understand if the tractor is weeding on a sloped area, or, more importantly, if the tractor enters a dangerous scenario where the pitch of the ground it is on is too steep in any direction for safe operation.
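To make the encoder-plus-GPS idea concrete, here's a minimal sketch of dead reckoning with a kinematic bicycle model, periodically nudged toward a GPS fix. The wheelbase, blend weight, and filter itself are our own illustrative assumptions; a production system would use something like an extended Kalman filter:

```python
import math

def dead_reckon(x, y, heading, distance, steer_angle, wheelbase=2.5):
    """One odometry update of a kinematic bicycle model.
    `distance` comes from the rear wheel encoders, `steer_angle` (rad)
    from the steering encoder. The 2.5 m wheelbase is made up."""
    heading += (distance / wheelbase) * math.tan(steer_angle)
    x += distance * math.cos(heading)
    y += distance * math.sin(heading)
    return x, y, heading

def gps_blend(est, gps, weight=0.1):
    """Complementary filter: nudge the dead-reckoned position toward
    the slower but unbiased GPS fix."""
    return tuple(e + weight * (g - e) for e, g in zip(est, gps))

# Drive 10 m straight (1 m per encoder tick), then apply a GPS fix
x, y, h = 0.0, 0.0, 0.0
for _ in range(10):
    x, y, h = dead_reckon(x, y, h, 1.0, 0.0)
x, y = gps_blend((x, y), (10.2, 0.1))
```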
And, beyond that, that is...well, that is that for the tractor. It's a refreshingly simple design, and we applaud the FarmWise team for creating such an elegant sensor array for this portion of its robot! Now...to the weeder!
The weeder is where the sensor array heads to the next level...or so we think. After all, this is where the FarmWise robot must employ its smarts to determine what is a weed, and what isn't a weed, and make the appropriate decision to maximize crop yield while minimizing crop damage.
The first thing we notice on the weeder is the flexible skirting that covers the unit. Beyond making it look cool like a hovercraft, this has a very practical purpose: it controls lighting within the weeder. It also allows for the use of two sensing modalities that require precise lighting control: RGB CMOS cameras, and depth (3D) cameras.
Within the weeder, we believe these two modalities are used in concert to identify weeds and then guide the cutting blades that destroy them.
Because of the width of the weeder, there will be multiple RGB CMOS cameras. It appears that the mechanical setup of the weeder has it covering six rows of crops as the FT35 moves down a crop row.
Based on the above image from FarmWise's weed and plant identifying system, it appears that a single camera can cover two rows. Therefore, it is likely that at least three RGB CMOS cameras are contained in the weeder. With that said, the above image was from the early days of FarmWise's prototype development, and we'd assume that they have moved to a camera per row, for a total of six CMOS cameras. But CMOS alone can't let FarmWise accurately weed. We think FarmWise needs depth sensing, too.
Depth sensing can do two things for FarmWise: first, it can help judge the distance from the weeder to the weed, for the most accurate cut. Second, it can provide infrared visual data that can be used in conjunction with color visual data to corroborate what is a weed and what is a crop. That infrared data source is a key reason why the flexible rubber shield is in place, as we noted earlier. Sunlight can easily wash out infrared sensor signals, so blocking as much of it as possible is critical.
In addition to needing to control for ambient lighting, we see a few more design challenges that the FarmWise team have likely addressed (or will need to address) for their weeder's sensors.
First: dust and debris. Even if the FT35 rolls sloooowly over the field, it is going to kick up dust. Wind will kick up dust. Nearby workers will kick up dust. The weeder's blades will kick up dust. And if there's anything we've learned over our collective decades building sensors, it's that lenses love to collect dust. At some point, enough dust will settle on the lenses of the weeder's cameras and depth sensors that they will become occluded and fail to operate correctly until they are cleaned. FarmWise may address this by simply adding periodic cleaning as a maintenance task, or through mechanical means. For instance, the FarmWise weeder might have an automated brush that can sweep dust away from lenses. Or it may have an onboard air compressor that sends blasts of air across the lenses to free loose dust and dirt.
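A software complement to either cleaning approach is to monitor image sharpness so the robot can tell *when* a lens has fouled. Here's a minimal sketch using the variance of a simple Laplacian as a focus/occlusion score; the metric choice and thresholds are our assumptions, not anything we know about FarmWise's system:

```python
import numpy as np

def sharpness(gray):
    """Variance of a 4-neighbour Laplacian. A dusty or occluded lens
    blurs the image, collapsing this score toward zero."""
    lap = (np.roll(gray, 1, 0) + np.roll(gray, -1, 0)
           + np.roll(gray, 1, 1) + np.roll(gray, -1, 1) - 4 * gray)
    return lap.var()

rng = np.random.default_rng(0)
clean = rng.random((64, 64))        # high-frequency field texture
occluded = np.full((64, 64), 0.5)   # uniform gray: lens blocked
```

In practice you'd compare the rolling score against a baseline captured with clean optics, and flag the camera for a brush pass or air blast when it drops below some fraction of that baseline.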
Next up, we have temperature. Farms can be in cool climates, hot climates, and climates that are both. Temperature can impact the performance of many sensors, but depth sensors in particular can be sensitive to it. This is because the lasers that generate infrared light in depth sensors are tuned to a specific wavelength, and a laser's ability to emit at that wavelength is temperature-dependent! Therefore, the FarmWise weeder's depth sensors may need active temperature control to ensure that laser light can be reliably generated at the proper wavelength, regardless of ambient temperature.
Finally, we have vibration. Given the size and robustness of the weeder, no doubt the sensors are very rigidly mounted to the chassis of the weeder. However, no matter how rigidly mounted the sensors may be, the vibrations and knocks that the weeder will experience during operation will invariably throw them out of calibration.
Indeed, in FarmWise's image above, we see a fairly massive calibration target being employed under the weeder. Like any other multi-sensor device, FarmWise's weeder will need periodic recalibration to ensure that crops aren't damaged and weeds are thoroughly removed. As an added note, our guess is that FarmWise calibrates neighboring RGB CMOS cameras to each other to provide a single composite image to be processed by its computer vision pipeline. This is to go beyond weeding, to also deliver datasets to farms such as total number of plants counted, healthy plants versus diseased plants, and other important metrics.
💡 We'd like to note to the FarmWise team that we can likely reduce their calibration time to minutes to maximize time spent in the field. ;)
Because the FarmWise weeder uses so many cameras, and because they are likely used in combination with each other, there are sensor fusion tasks to be done. If our assumption is correct, and there are between nine and twelve cameras employed in the weeder, then there are nine to twelve clocks to be synchronized to ensure that all data is received with correct timestamps. Similarly, there may be cameras that capture the same scene but believe they see different things due to some of the other challenges we noted earlier, such as dust occlusion. The FarmWise team must therefore create some method of resolving camera conflicts.
We'll note one last challenge: data size and transfer. As we noted earlier, there are up to 12 camera streams (along with the HDR camera at the front of the tractor) that the FT35 must manage. Undoubtedly, this much visual data is taking up at least a core's worth of onboard processing, if not more. Given the vast size of farm fields, there is likely no WiFi network that the FT35 can access on demand, so any data transfer will occur over a cellular data link. That means that additional processing like image compression must be done onboard the FT35 to minimize file sizes that are transferred to FarmWise's cloud service (oh...we also believe that FarmWise has a cloud service).
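Some back-of-the-envelope arithmetic shows why onboard compression is non-negotiable. The resolution, frame rate, and camera count below are purely our guesses, not FarmWise specs:

```python
def raw_stream_mbps(width, height, bytes_per_px, fps, n_cameras):
    """Raw (uncompressed) video bandwidth in megabits per second."""
    return width * height * bytes_per_px * fps * n_cameras * 8 / 1e6

# 12 hypothetical 1280x720 RGB cameras at 15 fps
raw = raw_stream_mbps(1280, 720, 3, 15, 12)        # ~3981 Mbps raw
compressed = raw / 50                               # optimistic 50:1 codec
```

Even with a generous 50:1 compression ratio, that's on the order of 80 Mbps sustained, which is far more than a rural cellular link can reliably carry. Hence our guess that heavy processing and summarization happen on the FT35 itself, with only distilled results going to the cloud.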
So there you have it; there's our best guess for the sensor package, sensor advantages, and sensor challenges for the FarmWise FT35's tractor and weeder. What do you think — did we nail it, or did we fail miserably? If you work at FarmWise and happen to see this, let us know!
And, if you enjoyed this post, stay tuned...we'll be doing more sensor array breakdowns like this soon. And if you are working on a vision-enabled platform, check out the Tangram Vision SDK — we're making sensor integration, calibration, and maintenance much more robust, but also much less time-consuming. Thanks for reading, follow us on Twitter, and we'll see you again soon!