This repository includes the complete codebase, documentation, and reports for the Autonomous Tractor Project developed during the ABE 6990 course at Mississippi State University. The project aims to create an autonomous tractor capable of:
- Precise Navigation: Reaching a target goal using GPS localization.
- Obstacle Avoidance: Using an RGB-D camera for real-time detection of obstacles.
- Actuator Control: Controlling acceleration (linear actuator) and steering (stepper motor) with commands derived from PID-based control logic.
The tractor system is developed and integrated using ROS 2, Python, and Arduino to achieve seamless communication between sensors, actuators, and computational modules.
The tractor uses a Proportional-Integral-Derivative (PID) controller to:
- Compute linear velocity (`linear.x`) to drive the tractor forward or backward based on the distance to the target.
- Compute angular velocity (`angular.z`) to adjust the steering wheel and correct heading errors.

For obstacle avoidance:
- An RGB-D camera calculates the distance to obstacles and publishes it to the `camera/distance` topic.
- If an obstacle is detected within 5 meters, the system halts all movement to ensure safety.
The following diagram illustrates the overall system architecture:
The following diagram illustrates the overall ROS2 integration message flow:
As shown in the figures above, the ROS 2 system operates as follows:
- GPS Driver Node:
  - The GPS unit collects LLA (Latitude, Longitude, Altitude) data and publishes it as a `NavSatFix` message to the `/gps/fix` topic.
- Camera Driver Node:
  - The RGB-D camera collects depth data and publishes it as a `3DArray` message to the `/camera/depth` topic.
- Simple GPS Controller Node:
  - Subscribes to the `/gps/fix` and `/camera/depth` topics.
  - Processes GPS and camera data to compute control commands.
  - Publishes a `Twist` message to the `/cmd_vel` topic, containing linear and angular velocity commands.
- Serial Command Publisher Node:
  - Subscribes to the `/cmd_vel` topic.
  - Converts the `Twist` message into serial commands and sends them to the Arduino via a serial connection.
- Arduino:
  - Receives serial commands and translates them into PWM signals. These signals control:
    - Linear Actuator: forward/backward acceleration based on `linear.x`.
    - Stepper Motor: steering adjustments based on `angular.z`.
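The Twist-to-serial conversion in the Serial Command Publisher can be sketched as follows. The command framing (`V<linear>,S<angular>`) and the clamping range are illustrative assumptions for this sketch, not the project's actual wire protocol:

```python
def twist_to_serial(linear_x, angular_z):
    """Convert Twist velocities into a serial command string.

    The "V<linear>,S<angular>" framing is a hypothetical protocol chosen
    for illustration; the real Arduino firmware may expect another format.
    """
    # Clamp velocities to a safe range before sending to the actuators
    linear_x = max(-1.0, min(1.0, linear_x))
    angular_z = max(-1.0, min(1.0, angular_z))
    return f"V{linear_x:.2f},S{angular_z:.2f}\n"

# Example: forward at half speed with a slight left turn
print(twist_to_serial(0.5, 0.2))  # V0.50,S0.20
```

A fixed-width, newline-terminated format like this keeps the Arduino-side parser trivial: it can read until `\n` and split on the comma.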
This architecture ensures seamless communication between sensors, actuators, and control modules to achieve autonomous navigation and obstacle avoidance.
The tractor uses GPS coordinates (latitude, longitude, altitude) for navigation:
- The target goal and the tractor’s current position are converted from LLA (Latitude, Longitude, Altitude) to ENU (East-North-Up) coordinates using `ecef_to_enu`.
- The PID algorithm computes:
  - Distance Error: The Euclidean distance between the current position and the target.
  - Heading Error: The angular difference between the desired orientation and the current orientation.
The system continuously updates the tractor's linear and angular velocities until it reaches the goal or an obstacle is detected.
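The coordinate conversions can be sketched with standard WGS-84 formulas. The function names match those used in the project (`lla_to_ecef`, `ecef_to_enu`), but the signatures here are assumptions; in particular, this sketch passes the origin's latitude/longitude to `ecef_to_enu` explicitly, which the repository's version may not require:

```python
import numpy as np

# WGS-84 ellipsoid constants
A = 6378137.0                # semi-major axis (m)
E2 = 6.69437999014e-3        # first eccentricity squared

def lla_to_ecef(lat_deg, lon_deg, alt_m):
    """Convert geodetic LLA to ECEF coordinates (meters)."""
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    n = A / np.sqrt(1 - E2 * np.sin(lat) ** 2)  # prime vertical radius
    x = (n + alt_m) * np.cos(lat) * np.cos(lon)
    y = (n + alt_m) * np.cos(lat) * np.sin(lon)
    z = (n * (1 - E2) + alt_m) * np.sin(lat)
    return np.array([x, y, z])

def ecef_to_enu(point_ecef, origin_ecef, origin_lat_deg, origin_lon_deg):
    """Express an ECEF point in a local ENU frame anchored at the origin."""
    lat, lon = np.radians(origin_lat_deg), np.radians(origin_lon_deg)
    d = point_ecef - origin_ecef
    east = -np.sin(lon) * d[0] + np.cos(lon) * d[1]
    north = (-np.sin(lat) * np.cos(lon) * d[0]
             - np.sin(lat) * np.sin(lon) * d[1] + np.cos(lat) * d[2])
    up = (np.cos(lat) * np.cos(lon) * d[0]
          + np.cos(lat) * np.sin(lon) * d[1] + np.sin(lat) * d[2])
    return np.array([east, north, up])
```

A goal 0.001° of longitude east of the origin at mid-latitudes comes out roughly 90 m along the ENU east axis, with near-zero north and up components, which is a quick sanity check for the rotation matrix.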
The RGB-D camera module publishes:
- RGB Images: Published on `camera/rgb` for visualization and debugging.
- Obstacle Distance: Published on `camera/distance` for real-time safety checks.
If the obstacle distance is ≤ 5 meters, the tractor stops and waits until the path is clear.
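The halt condition can be expressed as a small guard applied to the velocity command before it is published. The 5-meter threshold comes from the project; the function itself is an illustrative sketch:

```python
SAFETY_DISTANCE_M = 5.0  # halt threshold from the project spec

def safe_cmd(linear_x, angular_z, obstacle_distance_m):
    """Zero out the command when an obstacle is within the safety radius."""
    if obstacle_distance_m is not None and obstacle_distance_m <= SAFETY_DISTANCE_M:
        return 0.0, 0.0  # stop and wait until the path is clear
    return linear_x, angular_z

print(safe_cmd(0.8, 0.1, 4.2))   # (0.0, 0.0) -> obstacle too close, halt
print(safe_cmd(0.8, 0.1, 12.0))  # (0.8, 0.1) -> path clear
```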
- Linear Actuator: Controls forward/backward acceleration based on `linear.x` velocity.
- Stepper Motor: Controls steering adjustments based on `angular.z` velocity.
The control commands are processed by an Arduino, which executes low-level motor control for smooth operation.
The tractor's navigation logic is implemented using the PID algorithm. Below is an explanation of the included code:
- GPS Conversion:
  - The goal and tractor positions are converted to local ENU coordinates using `lla_to_ecef` and `ecef_to_enu`.
  - Example:
    ```python
    goal_ecef = lla_to_ecef(lat_goal, lon_goal, alt_goal)
    goal_enu = ecef_to_enu(goal_ecef, orig_ecef)
    ```
- PID Logic:
  - Linear Velocity: `v = Kp_linear * e_distance` (proportional control for forward/backward motion)
  - Angular Velocity: `omega = Kp_angular * e_theta` (proportional control for steering)
  - These values are updated at every timestep and published to `/cmd_vel`.
- Trajectory Plotting:
  - The trajectory and error values are logged and plotted in real time for debugging and analysis.
The PID algorithm ensures smooth and accurate navigation:
- The tractor follows the shortest path to the goal.
- Errors are reduced dynamically using proportional control.
```python
import numpy as np

def wrap_to_pi(angle):
    """Wrap an angle to the range [-pi, pi]."""
    return (angle + np.pi) % (2 * np.pi) - np.pi

for t in np.arange(0, T, dt):
    # Errors relative to the target
    e_distance = np.sqrt((x_target - x) ** 2 + (y_target - y) ** 2)
    desired_theta = np.arctan2(y_target - y, x_target - x)
    e_theta = wrap_to_pi(desired_theta - theta)
    # Proportional control
    v = Kp_linear * e_distance
    omega = Kp_angular * e_theta
    # Update state (unicycle kinematics)
    x += v * np.cos(theta) * dt
    y += v * np.sin(theta) * dt
    theta += omega * dt
```
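With illustrative gains and an initial pose (values chosen here for demonstration, not taken from the project's tuning), the loop above drives the simulated tractor to the goal:

```python
import numpy as np

def wrap_to_pi(angle):
    """Wrap an angle to the range [-pi, pi]."""
    return (angle + np.pi) % (2 * np.pi) - np.pi

# Illustrative parameters (not the project's tuned values)
Kp_linear, Kp_angular = 0.5, 2.0
T, dt = 60.0, 0.1
x, y, theta = 0.0, 0.0, 0.0       # start pose
x_target, y_target = 10.0, 5.0    # goal in local ENU (m)

for t in np.arange(0, T, dt):
    e_distance = np.sqrt((x_target - x) ** 2 + (y_target - y) ** 2)
    desired_theta = np.arctan2(y_target - y, x_target - x)
    e_theta = wrap_to_pi(desired_theta - theta)
    v = Kp_linear * e_distance
    omega = Kp_angular * e_theta
    x += v * np.cos(theta) * dt
    y += v * np.sin(theta) * dt
    theta += omega * dt

print(f"final position: ({x:.2f}, {y:.2f})")  # ends close to the target
```

Because `v` is proportional to the remaining distance, the step size shrinks as the tractor nears the goal, which is what keeps the simple proportional scheme from overshooting.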
The RGB-D camera module processes image data and publishes it to ROS 2 topics. The data flow is illustrated below:
- Camera Input:
  - Captures RGB images and depth data in real time.
- ROS 2 Topics:
  - `camera/rgb`: RGB images for visualization.
  - `camera/depth`: Depth data used to calculate the distance to obstacles.
- Obstacle Detection:
  - The obstacle distance is published on `camera/distance`. If the distance is ≤ 5 meters, the tractor halts.
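One common way to reduce a depth image to the single distance value published on `camera/distance` is to take the minimum valid depth inside a central region of interest. This is a hedged sketch of that idea, not necessarily how the project's `depthai_publisher` computes it:

```python
import numpy as np

def obstacle_distance(depth_m, roi_frac=0.3):
    """Return the nearest valid depth (meters) inside a central ROI.

    depth_m: 2-D array of per-pixel depth in meters (0 = invalid reading).
    roi_frac: fraction of image width/height used for the central window.
    """
    h, w = depth_m.shape
    dh, dw = int(h * roi_frac / 2), int(w * roi_frac / 2)
    roi = depth_m[h // 2 - dh:h // 2 + dh, w // 2 - dw:w // 2 + dw]
    valid = roi[roi > 0]          # drop invalid (zero) readings
    return float(valid.min()) if valid.size else float("inf")
```

Restricting the check to a central window keeps the tractor from halting for obstacles well off its path, at the cost of ignoring objects near the frame edges.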
| Feature | Description |
|---|---|
| GPS-Based Navigation | Precise localization using ENU coordinates and PID control. |
| Obstacle Detection | Real-time safety checks using RGB-D camera depth data. |
| Actuator Control | Smooth control of acceleration (linear actuator) and steering (stepper motor). |
| ROS 2 Integration | Seamless communication between sensors and actuators using ROS 2 nodes and topics. |
```shell
git clone https://github.com/yourusername/Ros2-Autonomous-Tractor.git
cd Ros2-Autonomous-Tractor
```
Install required Python packages:
```shell
pip install -r requirements.txt
```
- Start the Camera Module:
  ```shell
  ros2 run depthai_publisher depthai_publisher_node
  ```
- Start the Navigation Controller:
  ```shell
  ros2 run tractor_navigation pid_controller_node
  ```
The tractor reached the goal point autonomously using the PID algorithm, GPS localization data, and actuator control.
- Goal Accuracy: Reaches the goal within a 5-meter tolerance.
- Obstacle Detection: Detects and halts for obstacles ≤ 5 meters.
- PID Stability: Smooth navigation with minimal overshoot.
- Base Station Setup:
- Improve localization accuracy to centimeter-level precision by integrating a base station for RTK GPS corrections.
- Use of LIDAR and SLAM:
- Integrate LIDAR sensors and implement SLAM (Simultaneous Localization and Mapping) for advanced path planning.
- Complex Terrain Navigation:
- Enhance the tractor's capabilities to handle complex and uneven terrain for broader agricultural applications.
- Charles Raines
- Sushant Gautam
- Andres Arias Londono
The license for this project is provided here.