Autonomous Navigation

```mermaid
graph LR;

    subgraph Crop Follow
    %% A([Wait])
    B([Crop Follow])
    C([Stop])
    D([Drive Slow])
    end

    subgraph Manual Control
    E([Manual])
    end

    E --> |Toggle Crop Follow| B
    %% A --> |Aligned| B
    B --> |Danger| C
    B --> |Safe| B
    B --> |Warning| D
    C --> |Warning| D
    C --> |Safe| B
    D --> |Safe| B
    D --> |Danger| C

    %% A --> |Toggle Manual| E
    B --> |Toggle Manual| E
    C --> |Toggle Manual| E
    D --> |Toggle Manual| E
```

State machine diagram for different navigator states

Crop Row Following

The robot uses an Intel Realsense camera and OpenCV to maintain a straight path down a crop row. It isolates plants by masking non-green pixels with an HSV threshold, then crops the image to reduce both error and computation time. Candidate crop rows are found by computing the ratio of green pixels in each image column and smoothing the resulting profile to pick out peaks. Assuming three crop rows, the system fits lines that minimize the squared distances to those rows and computes their vanishing point, which sets the robot's heading angle. A PID controller converts this angle into an angular velocity, and the linear velocity is scaled proportionally to keep the navigation precise.
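The green-masking and row-peak steps above can be sketched roughly as follows. This is an illustrative reconstruction, not the robot's actual code: the HSV bounds, smoothing window, and minimum peak separation are assumed values, and a real pipeline would operate on `cv2`-converted HSV frames.

```python
import numpy as np

# Assumed HSV bounds for "green"; the robot's tuned values may differ.
HSV_LO = np.array([35, 60, 40])
HSV_HI = np.array([85, 255, 255])

def green_mask(hsv_img):
    """Mask non-green pixels via HSV thresholding (True = plant pixel)."""
    return np.all((hsv_img >= HSV_LO) & (hsv_img <= HSV_HI), axis=-1)

def row_peaks(mask, window=15, n_rows=3, min_dist=20):
    """Locate crop-row columns from the per-column green-pixel ratio.

    The ratio profile is smoothed with a moving average, then the
    n_rows strongest columns are picked greedily, keeping peaks at
    least min_dist pixels apart so one wide row is not counted twice.
    """
    ratio = mask.mean(axis=0)                                # green ratio per column
    smooth = np.convolve(ratio, np.ones(window) / window, mode="same")
    order = np.argsort(smooth)[::-1]                         # columns by descending score
    peaks = []
    for x in order:
        if all(abs(int(x) - p) >= min_dist for p in peaks):
            peaks.append(int(x))
        if len(peaks) == n_rows:
            break
    return sorted(peaks)
```

The greedy minimum-distance rule is one simple way to keep the three detected peaks on three distinct rows; peak smoothing beforehand suppresses single-column noise.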

Object Detection

Since farmers will be working around the crop rows where the robot operates, the robot needs to be aware of its surroundings. We use Hokuyo's URG-04LX laser scanner, which covers the front and most of the sides of the robot. When the scanner detects an object, the robot enters one of three states, Danger, Warning, or Safe, depending on the object's proximity. Based on the state, the robot stops, slows down, or continues moving, respectively.
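The proximity classification might look like the sketch below. The distance thresholds are illustrative assumptions, not the values tuned on the robot, and a real implementation would read the ranges from the URG-04LX driver.

```python
# Assumed distance thresholds in meters; the robot's tuned values may differ.
DANGER_M = 0.5
WARNING_M = 1.5

def safety_state(ranges):
    """Classify a laser scan into Danger / Warning / Safe by its closest return."""
    valid = [r for r in ranges if r > 0.0]   # drop invalid (zero) returns
    if not valid:
        return "Safe"
    nearest = min(valid)
    if nearest < DANGER_M:
        return "Danger"    # stop
    if nearest < WARNING_M:
        return "Warning"   # drive slow
    return "Safe"          # continue at normal speed
```

Classifying on the single nearest return keeps the behavior conservative: one close obstacle anywhere in the scan is enough to stop the robot.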

Lighthouse

To visually indicate the robot's "safety state," the object detector sends its state to the LED lighthouse, which then activates the corresponding colored light. Additionally, a buzzer is triggered when the robot is in either the Danger or Warning state, ensuring that nearby individuals can hear it even if they cannot see the lights.
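The state-to-output mapping is simple enough to express as a lookup table. This is a hypothetical sketch; the actual LED and buzzer control depends on the lighthouse's wiring and GPIO interface, which are not shown here.

```python
# Hypothetical mapping from safety state to lighthouse output.
LIGHTHOUSE = {
    "Danger":  {"led": "red",    "buzzer": True},
    "Warning": {"led": "yellow", "buzzer": True},
    "Safe":    {"led": "green",  "buzzer": False},
}

def lighthouse_command(state):
    """Return the LED color and buzzer setting for a given safety state."""
    return LIGHTHOUSE[state]
```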

Evaluation

Using a 30-second video running at 30 frames per second, and running each frame through our line detector at 320x240 resolution, we detected all three crop lines in 904 of 924 frames, or 97.8% of the frames. We can also plot the cloud of calculated vanishing points to see how the estimated heading point changes over the video. With the exception of 11 outliers, almost 99% of the points cluster around what appears to be the center of the crop row. Note that only the x value of the point matters, since it alone is used to calculate the heading angle.
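The last point, that only the vanishing point's x coordinate matters, can be made concrete with a small sketch. The focal length below is an assumed placeholder, not the Realsense's calibrated intrinsic.

```python
import math

def heading_angle(vp_x, image_width=320, focal_px=300.0):
    """Heading angle (radians) from the vanishing point's x coordinate.

    Positive means the vanishing point lies right of the image center,
    so the robot should steer right; negative means steer left.
    """
    offset = vp_x - image_width / 2.0   # pixels right of image center
    return math.atan2(offset, focal_px)
```

A vanishing point exactly at the image center (x = 160 at 320x240) yields a zero heading correction, which is why only horizontal outliers in the point cloud affect steering.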