ROW-SLAM: Under-Canopy Cornfield Semantic SLAM

Cornfield weed control has conventionally relied heavily on herbicides, which are undesirable due to environmental and health concerns and cannot be used in organic fields. The alternative, manual weeding, is labor-intensive and costly. Therefore, there has been significant interest in robotic weeding. However, most existing techniques focus on early-season weeding. During this period, the canopy is not yet closed, so a reliable GPS signal is usually available. Further, the robots can obtain top-down views, which are convenient for detecting and localizing weeds. In this work, we focus on mid-season weeding, where the robot must operate under the canopy.

In this paper, we introduce ROW-SLAM, a multi-view vision system for the detection and 3D localization of corn stalks in mid-season corn rows. We present this system (Fig. 1, 3) as a prototype of the vision module for an autonomous weeding robot currently under development. In our approach, we model the corn stalks in a row as lying on a plane perpendicular to the ground. A ground-view camera (back camera) performs SLAM using ground-plane features to obtain 3D odometry. We implement a Structure-from-Motion (SfM) strategy that accommodates the multi-view inputs to estimate the 3D pose of the corn-stalk plane. Combining this motion-estimation module with the object detection and tracking modules, we build a map that contains both metric (location and orientation) and semantic information about the corn stalks. We present field results that demonstrate the accuracy and robustness of our approach for weeding mid-season cornfields.
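To make the vertical-plane model concrete, below is a minimal Python sketch, not the paper's implementation. It assumes stalk detections have already been triangulated into a gravity-aligned world frame (z up), so fitting a plane constrained to be perpendicular to the ground reduces to a total-least-squares line fit over the stalks' (x, y) coordinates. The function names and the 4x4 odometry pose interface are hypothetical.

```python
# Minimal sketch (assumptions noted above; not the authors' code):
# fit a vertical plane n . p = d with n = (cos t, sin t, 0) to stalk points.
import numpy as np

def fit_vertical_stalk_plane(stalk_points_world: np.ndarray):
    """Fit a ground-perpendicular plane to Nx3 stalk points.

    Since the plane normal is constrained to the horizontal (xy) plane,
    only the points' xy-coordinates matter; the best-fit normal is the
    minor principal axis of the 2D point cloud (total least squares).
    """
    xy = stalk_points_world[:, :2]
    centroid = xy.mean(axis=0)
    # Minor eigenvector of the 2x2 covariance = direction of least spread
    # = in-plane normal of the stalk row.
    cov = np.cov((xy - centroid).T)
    eigvals, eigvecs = np.linalg.eigh(cov)
    n2d = eigvecs[:, np.argmin(eigvals)]       # unit normal in xy
    normal = np.array([n2d[0], n2d[1], 0.0])   # lift to 3D: vertical plane
    offset = float(normal[:2] @ centroid)      # plane offset d in n . p = d
    return normal, offset

def detection_to_world(T_world_cam: np.ndarray, p_cam: np.ndarray):
    """Map a stalk detection from the camera frame to the world frame
    using a 4x4 odometry pose (hypothetical interface to the SLAM module)."""
    return (T_world_cam @ np.append(p_cam, 1.0))[:3]

# Toy usage: noisy stalk bases along the row plane x = 2.
rng = np.random.default_rng(0)
pts = np.column_stack([
    2.0 + 0.02 * rng.standard_normal(20),  # x: near the row plane
    np.linspace(0.0, 5.0, 20),             # y: spacing along the row
    rng.uniform(0.0, 0.3, 20),             # z: stalk base heights
])
n, d = fit_vertical_stalk_plane(pts)
print("normal:", np.round(n, 3), "offset:", round(d, 3))  # ~ (+/-1, 0, 0), |d| ~ 2
```

Constraining the normal to the horizontal plane encodes the perpendicularity prior from the model above, so noise in the stalks' height estimates cannot tilt the fitted row plane.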

[Paper]