Software - TI-capacitor/T1GR3 GitHub Wiki
Autonomous Drone Software List
This document outlines the various software components used in the development of an autonomous drone for the Raytheon Autonomous Vehicle Competition.
PX4
PX4 is an open-source flight control software designed for drones and other unmanned vehicles. It provides:
- Stabilization: Handles low-level flight controls, ensuring smooth and stable flight.
- Navigation: Supports GPS-based navigation and autonomous flight.
- Sensor Integration: Works with various IMUs, barometers, and GPS modules.
- Communication: Interfaces with ground stations via MAVLink.
PX4 runs on a Pixhawk or similar flight controller, which communicates with the drone's motors, sensors, and control systems.
Gazebo
Gazebo is a powerful simulation tool used for robotics applications. It enables the testing of autonomous drone behaviors in a simulated environment.
Gazebo Classic
Originally called "Gazebo," it was the first version of the simulation software. It supported physics-based simulations but had limitations in scalability and modularity.
Ignition Gazebo
Developed to enhance usability and features, Ignition Gazebo introduced:
- A modular framework for more efficient simulations.
- Enhanced sensor modeling and rendering capabilities.
- Improved physics engines for more realistic dynamics.
Gazebo Sim
In 2022, Ignition Gazebo was renamed back to Gazebo (commonly referred to as Gazebo Sim to distinguish it from Gazebo Classic), and Gazebo Classic reached its end of life in January 2025. Gazebo Sim is now the standard for drone simulation and testing.
ROS (Robot Operating System)
ROS is a middleware that enables communication between different components of the autonomous drone system.
- ROS Topics: Used for messaging between nodes, enabling real-time data exchange.
- ROS Services: Provide request-response communication mechanisms.
- ROS Actions: Manage asynchronous tasks, such as long-duration maneuvers.
- ROS Nodes: Individual processes that execute specific functionalities (e.g., navigation, computer vision, control systems).
ROS serves as the interface between the control computer and PX4, allowing for high-level flight planning and execution using Python or C++ libraries.
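As a concrete illustration of the nodes-and-topics model described above, here is a minimal rclpy node sketch. The node and topic names (`drone_status`, `drone/status`) are illustrative, not taken from the project, and running it requires a ROS 2 installation with `rclpy` (the imports are kept inside `main()` so the pure helper can be used without ROS).

```python
# Minimal ROS 2 publisher sketch. Node/topic names are illustrative assumptions.
def make_status_message(phase: str, altitude_m: float) -> str:
    """Pure helper: format the status string the node publishes."""
    return f"phase={phase} altitude={altitude_m:.1f}m"

def main():
    # ROS-specific imports live here so the helper above works without ROS 2.
    import rclpy
    from rclpy.node import Node
    from std_msgs.msg import String

    class DroneStatusNode(Node):
        def __init__(self):
            super().__init__("drone_status")
            self.pub = self.create_publisher(String, "drone/status", 10)
            # Publish a status message once per second.
            self.timer = self.create_timer(1.0, self.tick)

        def tick(self):
            msg = String()
            msg.data = make_status_message("search", 10.0)
            self.pub.publish(msg)

    rclpy.init()
    node = DroneStatusNode()
    try:
        rclpy.spin(node)
    finally:
        node.destroy_node()
        rclpy.shutdown()

# Call main() to run the node under a sourced ROS 2 environment.
```

Any other node subscribing to `drone/status` receives these strings asynchronously, which is the real-time data exchange the topic mechanism provides.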
MAVSDK
MAVSDK is a modern API that facilitates communication with PX4 via MAVLink. It provides:
- Drone Control: Arm/disarm, takeoff, land, and maneuver drones.
- Offboard Mode: Enables sending velocity and position commands to control the drone externally.
- Telemetry: Retrieves live flight data, including GPS position, altitude, battery level, and system status.
In this project, MAVSDK is used in the autonomous_drone.py script to manage the drone's flight logic.
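The arm/takeoff/land capabilities listed above can be sketched with MAVSDK-Python. This is a minimal example, not the project's actual autonomous_drone.py: it assumes the `mavsdk` package is installed and a PX4 instance (e.g. SITL) is reachable on the default UDP port 14540.

```python
# Sketch of a MAVSDK-Python flight sequence (assumed defaults; requires the
# `mavsdk` package and a running PX4 instance, e.g. SITL on udp://:14540).
import asyncio

def connection_url(host: str = "", port: int = 14540) -> str:
    """Pure helper: build the UDP connection string MAVSDK expects."""
    return f"udp://{host}:{port}"

async def fly():
    from mavsdk import System

    drone = System()
    await drone.connect(system_address=connection_url())

    # Wait until the autopilot reports a usable connection.
    async for state in drone.core.connection_state():
        if state.is_connected:
            break

    await drone.action.arm()
    await drone.action.takeoff()
    await asyncio.sleep(10)   # hold briefly at the default takeoff altitude
    await drone.action.land()

# Run with: asyncio.run(fly())
```

Telemetry streams (GPS position, battery, etc.) follow the same `async for` pattern via `drone.telemetry`, which is why the flight logic is naturally written with asyncio.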
uXRCE-DDS (PX4-ROS 2/DDS Bridge)
PX4 uses uXRCE-DDS middleware to allow uORB messages to be published and subscribed on a companion computer as though they were ROS 2 topics. This provides a fast and reliable integration between PX4 and ROS 2, and makes it much easier for ROS 2 applications to get vehicle information and send commands.
- Architecture: For PX4 uORB topics to be shared on the DDS network, a uXRCE-DDS client must run on PX4, connected to a micro XRCE-DDS agent running on the companion computer.
- The PX4 uxrce_dds_client publishes a defined set of uORB topics to, and subscribes to them from, the global DDS data space.
- The eProsima micro XRCE-DDS agent runs on the companion computer and acts as a proxy for the client in the DDS/ROS 2 network.
- The agent itself has no dependency on client-side code and can be built and/or installed independently of PX4 or ROS.
- Code that subscribes or publishes to PX4 does depend on client-side definitions: it requires uORB message definitions matching those used to build the PX4 uXRCE-DDS client so that it can interpret the messages.
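On the ROS 2 side, the bridged uORB topics can be read like any other topic once the agent is running. The sketch below assumes the `px4_msgs` package (which provides those matching uORB message definitions) and a recent PX4 that publishes on `/fmu/out/vehicle_status`; the arming-state codes (1 = disarmed, 2 = armed) follow the `VehicleStatus` message definition, and PX4 publishes with best-effort QoS, which the subscriber must match.

```python
# Sketch: subscribe to a PX4 uORB topic bridged over uXRCE-DDS.
# Assumes px4_msgs is built in the workspace and the micro XRCE-DDS agent is running.
def classify_arming_state(arming_state: int) -> str:
    """Pure helper: map VehicleStatus arming_state codes (1=disarmed, 2=armed)."""
    return {1: "disarmed", 2: "armed"}.get(arming_state, "unknown")

def main():
    import rclpy
    from rclpy.node import Node
    from rclpy.qos import QoSProfile, ReliabilityPolicy, HistoryPolicy
    from px4_msgs.msg import VehicleStatus

    class StatusListener(Node):
        def __init__(self):
            super().__init__("px4_status_listener")
            # PX4 publishes with best-effort reliability, so match it here;
            # a default (reliable) subscriber would receive nothing.
            qos = QoSProfile(
                reliability=ReliabilityPolicy.BEST_EFFORT,
                history=HistoryPolicy.KEEP_LAST,
                depth=1,
            )
            self.create_subscription(
                VehicleStatus, "/fmu/out/vehicle_status", self.on_status, qos)

        def on_status(self, msg):
            self.get_logger().info(classify_arming_state(msg.arming_state))

    rclpy.init()
    rclpy.spin(StatusListener())
    rclpy.shutdown()

# Call main() to run the listener under a sourced ROS 2 + px4_msgs workspace.
```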
Source: the PX4 uXRCE-DDS middleware documentation.
OpenCV (Computer Vision)
OpenCV is used for image processing and computer vision tasks, such as:
- ArUco Marker Detection: Identifies and tracks fiducial markers for navigation and landing.
- Feature Extraction: Helps in obstacle avoidance and scene recognition.
- Camera Processing: Enhances visual input for better drone decision-making.
The drone scans for an ArUco marker in the search area, locates the designated landing spot, and autonomously performs a precision landing.
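The marker-tracking step above can be sketched with OpenCV's ArUco module. This is an illustrative outline, not the project's code: the `ArucoDetector` class assumes OpenCV 4.7+ (earlier versions use `cv2.aruco.detectMarkers` directly), the `DICT_4X4_50` dictionary is an assumption, and the gain/speed values in the helper are placeholders to be tuned in flight tests.

```python
# Sketch: detect an ArUco marker and turn its pixel offset into a velocity nudge.
# Assumes OpenCV >= 4.7 with the aruco module; dictionary and gains are placeholders.
def centering_velocity(marker_px, image_size, gain=0.002, max_speed=0.5):
    """Pure helper: convert the marker's pixel offset from image center into a
    clamped (vx, vy) velocity command. Gains are illustrative, not tuned values."""
    cx, cy = image_size[0] / 2.0, image_size[1] / 2.0
    ex, ey = marker_px[0] - cx, marker_px[1] - cy
    clamp = lambda v: max(-max_speed, min(max_speed, v))
    return clamp(gain * ex), clamp(gain * ey)

def detect_marker(frame):
    """Return the pixel center of the first detected marker, or None."""
    import cv2
    detector = cv2.aruco.ArucoDetector(
        cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50))
    corners, ids, _rejected = detector.detectMarkers(frame)
    if ids is None:
        return None
    c = corners[0][0]  # 4x2 array of the marker's corner pixels
    return (float(c[:, 0].mean()), float(c[:, 1].mean()))
```

During the landing sequence, the offset-derived velocity would be fed to PX4 (e.g. via offboard setpoints) until the marker sits at the image center, at which point the drone descends.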
Autonomous Drone Architecture
The drone operates in multiple phases:
- Connection & Initialization: Establishes MAVSDK connection with PX4.
- Takeoff Sequence: Takes off and reaches a specified altitude.
- Search Pattern Execution: Flies in a grid search pattern looking for the target ArUco marker.
- Landing Sequence: Once the marker is found, the drone lands precisely on the designated target.
This workflow is managed by ROS 2 in conjunction with MAVSDK and OpenCV, ensuring efficient autonomy and real-time responsiveness.
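The grid search phase can be made concrete with a waypoint generator. This is a minimal sketch of one common approach (a lawnmower/boustrophedon sweep in local NED coordinates), not necessarily the exact pattern the project flies; area dimensions and lane spacing are parameters the mission would set.

```python
# Sketch: lawnmower search pattern as (north, east, down) NED waypoints.
# Down is negative altitude, per the NED convention PX4 uses for local setpoints.
def grid_search_waypoints(width_m, height_m, lane_spacing_m, altitude_m):
    """Cover a width x height rectangle with parallel lanes, alternating
    direction each lane so the drone sweeps back and forth."""
    waypoints = []
    east = 0.0
    heading_north = True
    while east <= width_m:
        lane = [(0.0, east, -altitude_m), (height_m, east, -altitude_m)]
        waypoints.extend(lane if heading_north else reversed(lane))
        heading_north = not heading_north
        east += lane_spacing_m
    return waypoints
```

Each waypoint would be sent as an offboard position setpoint; between setpoints the vision node keeps scanning frames for the marker, and the landing phase preempts the remaining waypoints once it is found.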
Launch System
The autonomous_drone.launch.py script is responsible for initiating the autonomous drone application. It executes the autonomous_drone.py script within a ROS 2 launch framework.
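A launch file of this kind typically looks like the sketch below. The package and executable names here are assumptions (the wiki only names the scripts), so they would need to match the project's actual `setup.py` entry points.

```python
# Sketch of autonomous_drone.launch.py. Package/executable names are
# illustrative assumptions, not confirmed project values.
from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        Node(
            package="autonomous_drone",      # assumed package name
            executable="autonomous_drone",   # entry point for autonomous_drone.py
            name="autonomous_drone",
            output="screen",                 # show node logs in the terminal
        ),
    ])
```

It would be started with `ros2 launch <package> autonomous_drone.launch.py`, letting the same command also bring up companion nodes (e.g. the camera driver) as the system grows.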
Competition Considerations
The drone software is designed to comply with the Raytheon Autonomous Vehicle Competition requirements, which include:
- Autonomous navigation without human intervention.
- Real-time target detection using vision-based ArUco markers.
- Direct drone-to-drone communication without reliance on ground control.
- Mission flexibility, adapting to changing field conditions and event scenarios.
Future Improvements
To enhance performance, the following improvements are planned:
- AI-based path optimization using reinforcement learning.
- Sensor fusion combining LiDAR, IMU, and visual data for improved localization.
- Enhanced communication through ROS 2 DDS for better real-time coordination.
This wiki serves as a foundational reference for the development and refinement of the autonomous drone project.