Concepts
24.0623
This section describes the core ZiptieAI concepts:
- 1 Ziptie drones
- 2 Drone AI
- 3 AI drone dev process (for pros)
1 Ziptie drones
There are basically 2 aspects to the ziptie concept.
1.1 Your first drone build(s) should be test platform(s)
You can buy a drone that packs everything into a very compact package. But you can't take it apart. You can learn how to operate it, but not how to build one. You can't customize it.
ZiptieAI is about DIY as a way to learn the tech. For that you need to build your own ziptied-together test drones that give you the space to experiment and debug. Your first drone(s) should be test platforms.
1.2 Your future drone builds should be designed for quick upgrades / modifications
Your future drones should always be designed for quick modification. For example, the ziptie drone became famous on the Ukrainian battlefield. Military hardware usually goes through a rigorous long-term dev process, but small drones are different: you need to create mission-specific platforms that meet the immediate requirements.
2 Drone AI
AI (object detection / recognition) started out as the core focus of my ZiptieAI project (the drone was just the AI carrier), and it still is.
AI on a quadcopter typically runs on the
- FC (flight controller)
- CC (companion computer)
What is (small cheap drone) AI?
- Performed by FC:
  - 2.1 Fly by wire (similar to modern jets).
  - 2.2 Mission automation.
- Performed by CC:
  - 2.3 Object detection / recognition.
  - 2.4 Pilot assistance (similar to driver assistance in modern cars).
  - 2.5 Autonomous flight (the drone can fly a pre-determined flight plan or modify it in response to external events).
2.1 Fly by wire (FC)
Just as in a modern fighter jet, the quadcopter pilot does not directly control the "flight control surfaces" (the 4 rotors). Instead, the pilot inputs flight commands using an RC transmitter. The flight controller (a Pixhawk 6C, for example) performs all of the complex flight control.
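To make that concrete, here is a minimal Python sketch of the idea. This is not Pixhawk/ArduPilot firmware; the gains, value ranges, and mixer signs are illustrative assumptions. The point is just that the FC stabilizes the requested rates and then mixes the result into 4 motor outputs.

```python
# Minimal illustration of what the FC does between pilot input and the 4 rotors.
# NOT real flight-controller code; gains, ranges, and signs are made up for clarity.

def stabilize_rate(desired_rate, measured_rate, gain=0.1):
    """Toy proportional loop: a real FC runs cascaded PID loops at a few hundred Hz."""
    return gain * (desired_rate - measured_rate)

def mix_quad_x(throttle, roll, pitch, yaw):
    """Typical X-quad mixing pattern (exact signs depend on frame/prop convention)."""
    m1 = throttle - roll + pitch - yaw   # front-right
    m2 = throttle + roll - pitch - yaw   # rear-left
    m3 = throttle + roll + pitch + yaw   # front-left
    m4 = throttle - roll - pitch + yaw   # rear-right
    # Clamp to the 0..1 range an ESC expects (before scaling to PWM).
    return [min(max(m, 0.0), 1.0) for m in (m1, m2, m3, m4)]

# Pilot stick positions (from the RC) and measured body rates (from the IMU gyro):
rc = {"throttle": 0.55, "roll": 0.10, "pitch": -0.05, "yaw": 0.0}
gyro = {"roll": 0.02, "pitch": -0.01, "yaw": 0.0}

roll_out = stabilize_rate(rc["roll"], gyro["roll"])
pitch_out = stabilize_rate(rc["pitch"], gyro["pitch"])
yaw_out = stabilize_rate(rc["yaw"], gyro["yaw"])

print(mix_quad_x(rc["throttle"], roll_out, pitch_out, yaw_out))
```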
2.2 Mission automation
Mission software (such as QGroundControl or Mission Planner) is used to create a flight plan that is uploaded to the FC.
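Under the hood the plan is transferred to the FC over the MAVLink mission protocol; QGC and Mission Planner do this (and much more) for you. A rough pymavlink sketch of that transfer, assuming a link on udp:127.0.0.1:14550 (e.g. SITL) and placeholder waypoints:

```python
# Rough sketch of what GCS software does when it uploads a flight plan to the FC.
# The udp:127.0.0.1:14550 link and the coordinates are placeholder assumptions.
from pymavlink import mavutil

master = mavutil.mavlink_connection("udp:127.0.0.1:14550")
master.wait_heartbeat()                      # wait until the FC is talking to us

# Placeholder waypoints: (lat, lon, relative altitude in m).
waypoints = [
    (47.3977419, 8.5455938, 20),
    (47.3979000, 8.5460000, 20),
]

# MAVLink mission protocol: announce the count, then answer each MISSION_REQUEST.
master.mav.mission_count_send(master.target_system, master.target_component, len(waypoints))
for seq, (lat, lon, alt) in enumerate(waypoints):
    master.recv_match(type=["MISSION_REQUEST", "MISSION_REQUEST_INT"], blocking=True)
    master.mav.mission_item_int_send(
        master.target_system, master.target_component, seq,
        mavutil.mavlink.MAV_FRAME_GLOBAL_RELATIVE_ALT_INT,
        mavutil.mavlink.MAV_CMD_NAV_WAYPOINT,
        0, 1,                    # current, autocontinue
        0, 0, 0, 0,              # param1..4 (hold time, accept radius, pass radius, yaw)
        int(lat * 1e7), int(lon * 1e7), alt)

print(master.recv_match(type="MISSION_ACK", blocking=True))  # FC confirms the upload
```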
2.3 Object detection / recognition
The AI for this project focuses mainly on object detection and recognition. The CC (companion computer, also called a mission computer)
- Receives input from the camera via ROS messages (or a direct connection to the camera without ROS).
- Runs an AI routine to detect or recognize objects.
- Sends messages to the ground station or to the FC (to modify the flight plan).
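A rough sketch of that loop on the CC. It assumes a camera OpenCV can read, the ultralytics YOLO package as one possible detector, and a MAVLink link on udp:127.0.0.1:14550; all of these are assumptions, not the project's fixed stack.

```python
# Rough sketch of the CC loop: camera in, detector, MAVLink message out.
# Camera index, model file, and connection string are placeholder assumptions.
import cv2
from ultralytics import YOLO
from pymavlink import mavutil

cam = cv2.VideoCapture(0)                    # direct camera connection (no ROS)
model = YOLO("yolov8n.pt")                   # small pretrained model as a placeholder
link = mavutil.mavlink_connection("udp:127.0.0.1:14550")
link.wait_heartbeat()

while True:
    ok, frame = cam.read()
    if not ok:
        break
    results = model(frame, verbose=False)[0]  # run detection on one frame
    labels = {results.names[int(c)] for c in results.boxes.cls}
    if "person" in labels:
        # Tell the ground station (or FC) what was seen; STATUSTEXT is the simplest channel.
        link.mav.statustext_send(mavutil.mavlink.MAV_SEVERITY_INFO, b"person detected")
```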
2.4 Pilot assistance
Object detection could be used to alert the pilot to a detected object. This could be really useful when flying in a complex environment, and is comparable to the collision avoidance and driver alerts in modern cars.
Modern AI algorithms running on cheap HW can sometimes do a better job of detection than humans.
2.5 Autonomous drone control (by CC)
An autonomous AI drone can
- Fly the drone (continue the mission or return home) if the connection with the operator is lost.
- React to unexpected situations. For example, in the pic below, at X something occurs (maybe a special object was recognized) that causes the CC to send commands to the FC to
- (1) take special action then
- (2) return to home (H).
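A minimal sketch of that "(1) special action, (2) return to home" logic on the CC, assuming pymavlink and ArduPilot-style flight mode names; the trigger function and the 30 s loiter are placeholders, not a real mission.

```python
# Sketch of CC-side autonomy: if the trigger fires at point X, take a special
# action, then command return-to-launch. Mode names are ArduPilot-style; the
# trigger, timing, and connection string are placeholder assumptions.
import time
from pymavlink import mavutil

link = mavutil.mavlink_connection("udp:127.0.0.1:14550")   # assumed FC link
link.wait_heartbeat()

def special_object_recognized():
    """Placeholder for the detector from 2.3: returns True when the trigger object is seen."""
    return False

while True:
    if special_object_recognized():
        link.set_mode("LOITER")          # (1) special action: hold position at X
        time.sleep(30)                   #     e.g. observe for 30 s
        link.set_mode("RTL")             # (2) return to home (H)
        break
    time.sleep(0.5)
```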
3 AI drone dev process (for pros)
(links below may be out of date; they are not maintained)
This section shows how I originally (24 Jan) planned to approach building AI drones. Based on what I read, I thought that building flying drones was the easy part. So I focused first on simulation and AI (major efforts).
I was wrong. The process shown below is more for pros. However, I still think a beginner can get some benefit from these concepts.
This project defines 6 development stages:
- Part 1 Total simulation. Simulate everything in a sim world (Gazebo) on a Linux PC. Run a sim mission (for example, a sim camera on a sim copter recognizes a sim person and tells the copter to land near the person (search and rescue)).
The diagram above also shows
- GCS. Ground control software (runs on a PC) that creates the flight plan that is uploaded to the FC (the copter follows this flight plan unless instructed, for example by the CC, to do otherwise). Could also include the RC.
- RC. Joystick.
- Part 2 Real AI HW. Running the AI algorithms on the real HW (HW specifics matter a lot for AI). See the sketch after this list.
  - A real Pi camera sends video to the real Nano.
  - The real Nano (or Pi 4) performs object recognition.
  - The sim drone (SITL running on an Ubuntu PC) sends sim sensor data (UDP) to the mission software program (on the Nano).
  - The mission program running on the Nano sends flight control commands (TCP) to the sim drone. Basically, you test your autonomous AI program safely on a sim drone.
- Part 3 Real FC (HITL). Adding a real FC (with simulated frame, sensors, and world).
- Part 4 Basic platform build. Test flights and verification of AI/autonomous modes.
- Part 5 Mission platform build. STM32 tools, FC firmware dev and APIs, ROS, and wired comms.
- Part 6 Mission platforms. Building specific mission platforms, such as for mine clearance or target lock-on.
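Here is the Part 2 loop sketched in Python (referenced above): the mission program on the Nano reads sim sensor data from SITL and sends a command back. The UDP/TCP split mirrors the description above; the ports, the 50 m trigger, and the RTL reaction are placeholder assumptions (a single MAVLink link can also carry both directions).

```python
# Sketch of the Part 2 loop on the Nano: read sim sensor data from SITL, react.
# Ports are common SITL defaults but are assumptions; not production code.
from pymavlink import mavutil

telemetry = mavutil.mavlink_connection("udpin:0.0.0.0:14550")   # sim sensor data (UDP)
telemetry.wait_heartbeat()

commands = mavutil.mavlink_connection("tcp:127.0.0.1:5760")     # flight commands (TCP)
commands.wait_heartbeat()

while True:
    msg = telemetry.recv_match(type="GLOBAL_POSITION_INT", blocking=True)
    alt_m = msg.relative_alt / 1000.0          # mm -> m
    print(f"sim altitude: {alt_m:.1f} m")
    if alt_m > 50.0:
        # Example reaction: the mission program decides to bring the sim drone home.
        commands.set_mode("RTL")
        break
```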
TODO: take simulink concepts from this doc HAS_HITL_CONCEPTS_ML_8c_ML_Matlab_v24_24.0222.docx