Description of the system

Mission Controller

This is the core of the MultiDrone system. It receives cinematography missions in XML format from the Director’s Dashboard (a graphical tool to design missions, not included in this repository), computes mission plans and sends the corresponding tasks to each drone while monitoring the execution. It can be divided into different parts:

Interface with Dashboard

The Mission Controller periodically sends a system status message to the Dashboard and can receive the following commands (a rough sketch of this interface is given after the list):

  • Event enrolment
  • Mission enrolment
  • Validate missions
  • Select mission and sequence roles
  • Trigger events
  • Clear events and missions
  • Abort mission
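
As a rough illustration of this interface pattern (periodic status out, command callbacks in), the following sketch uses placeholder topic/service names and generic message types; the actual definitions live in the repository’s message and service packages, not here.

```python
#!/usr/bin/env python
# Hypothetical sketch of the Dashboard-facing interface pattern: the real topic/service
# names and message types are defined in the repository, not in this example.
import rospy
from std_msgs.msg import String                     # placeholder for the status message
from std_srvs.srv import Trigger, TriggerResponse   # placeholder for command services

def handle_abort(_req):
    rospy.logwarn("Abort mission requested by the Dashboard")
    return TriggerResponse(success=True, message="mission aborted")

def mission_controller_interface():
    rospy.init_node("mission_controller_interface")
    status_pub = rospy.Publisher("system_status", String, queue_size=1)
    # One service per Dashboard command (enrol events/missions, validate, select,
    # trigger events, clear, abort); only "abort" is sketched here.
    rospy.Service("abort_mission", Trigger, handle_abort)
    rate = rospy.Rate(1.0)   # the status message is sent periodically, e.g. at 1 Hz
    while not rospy.is_shutdown():
        status_pub.publish(String(data="IDLE"))  # the real status is a structured message
        rate.sleep()

if __name__ == "__main__":
    mission_controller_interface()
```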

High-level planning

The received mission needs to be transformed into specific tasks for each drone. The High-level Planning module performs this task allocation, taking into account drone constraints (e.g. battery, starting position) and scenario constraints (e.g. safe landing sites, no-fly zones), and tries to maximize the total shooting time.
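
The planner solves an optimization problem; as a much simplified illustration of the idea (not the algorithm used in the repository), a greedy allocation could assign each shooting action to a feasible drone so as to cover as much shooting time as possible, with the feasibility and cost callables standing in for battery, starting-position and no-fly-zone constraints:

```python
# Simplified, hypothetical sketch of task allocation: greedily assign each shooting
# action to a feasible drone so that the total covered shooting time is maximized.
# The real High-level Planning module formulates this as a proper optimization problem.

def allocate(actions, drones, feasible, cost):
    """actions: list of shooting actions, each with an 'id' and a 'duration' in seconds.
    drones:  list of drone descriptors, each with an 'id' and a 'battery' budget in seconds.
    feasible(drone, action): stands in for battery/no-fly-zone/landing-site checks.
    cost(drone, action): flight time (s) needed to reach and perform the action."""
    plan = {d["id"]: [] for d in drones}
    for action in sorted(actions, key=lambda a: a["duration"], reverse=True):
        candidates = [d for d in drones
                      if feasible(d, action) and d["battery"] >= cost(d, action)]
        if not candidates:
            continue  # this action cannot be covered; its shooting time is lost
        best = min(candidates, key=lambda d: cost(d, action))
        best["battery"] -= cost(best, action)
        plan[best["id"]].append(action["id"])
    return plan
```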

Mission and events manager

This module receives the events from the Dashboard and sends them to the drones. It also monitors the status of the drones, looking at their battery level, the actions they are performing and their position.
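
As a rough sketch of this pattern (topic names and message types below are placeholders, not the ones defined in the repository), the manager relays triggered events to per-drone topics and caches the latest status reported by each drone:

```python
#!/usr/bin/env python
# Hypothetical sketch of the mission/events manager pattern: forward Dashboard events
# to every drone and keep each drone's latest reported status.
import rospy
from std_msgs.msg import String   # placeholder for event and drone-status messages

DRONE_IDS = [1, 2, 3]             # assumed fleet, for illustration only
latest_status = {}

def make_status_cb(drone_id):
    def cb(msg):
        latest_status[drone_id] = msg.data   # battery, current action, position, ...
    return cb

def manager():
    rospy.init_node("mission_events_manager")
    event_pubs = [rospy.Publisher("drone_%d/event" % i, String, queue_size=10)
                  for i in DRONE_IDS]
    for i in DRONE_IDS:
        rospy.Subscriber("drone_%d/status" % i, String, make_status_cb(i))

    def forward_event(msg):
        for pub in event_pubs:
            pub.publish(msg)                 # broadcast the triggered event
    rospy.Subscriber("dashboard/event", String, forward_event)
    rospy.spin()

if __name__ == "__main__":
    manager()
```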

Onboard Scheduler

The Onboard Scheduler runs on board each drone and receives from the Mission Controller the list of actions assigned to that drone. It is then in charge of executing the actions sequentially, synchronizing their start and end and calling the Action Executor for the actual execution of the different navigation or shooting actions. This module reacts to alarms and emergencies, such as a low drone battery, and can command a safe path to a landing site thanks to an included path planner. It also reports the drone status to the Mission Controller.
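
A simplified sketch of the scheduling loop follows (names and the battery threshold are illustrative; the real module also synchronizes action start/end times with the other drones and uses the repository’s action interfaces):

```python
# Hypothetical sketch of the Onboard Scheduler loop: execute the received actions in
# order and divert to a safe landing site on an emergency such as low battery. The real
# module also synchronizes action start/end times and reports status periodically.

LOW_BATTERY = 0.20   # illustrative threshold (20 %)

def run_mission(actions, executor, battery_level, path_planner, landing_sites):
    """actions: ordered list of navigation/shooting actions for this drone.
    executor: object whose execute(action) delegates to the Action Executor.
    battery_level(): returns the remaining battery as a fraction in [0, 1].
    path_planner.plan(goal): returns a safe path to 'goal' avoiding no-fly zones."""
    for action in actions:
        if battery_level() < LOW_BATTERY:
            emergency_land(executor, path_planner, landing_sites)
            return "ABORTED_LOW_BATTERY"
        executor.execute(action)             # blocks until the action finishes
    return "MISSION_COMPLETED"

def emergency_land(executor, path_planner, landing_sites):
    site = landing_sites[0]                  # the real planner picks the best reachable site
    executor.follow_path(path_planner.plan(site))
    executor.land()
```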

Action Executor

The Action Executor runs on board each drone and is responsible for executing the navigation and shooting actions as they are received from the Onboard Scheduler. It can be divided into three parts:

  • Drone Control: According to the type of action (e.g. lateral, flyby, orbit) and based on the estimates of the target’s position, the Action Executor computes reference trajectories on the fly. These are tracked using a hierarchical controller with an inner-outer loop structure. The outer-loop controller is implemented in the Action Executor and provides linear velocity references that are sent to the UAL to feed the inner-loop control system (a sketch of such an outer loop is given after this list).

  • Gimbal Control: When the shooting action is specified, the user determines whether the gimbal tracks a GPS target or a visual target. In GPS tracking, position measurements of both the drone and the target, together with attitude measurements of the gimbal, are used to compute the relative attitude error, whereas tracking based on vision uses the image error directly. In both cases, a feedback law based on the attitude error is used to compute the angular velocity commands sent to the Gimbal Camera Interface module (a sketch of this feedback law is also given after the list).

  • Camera Control: This submodule is responsible for remotely changing the camera focus and zoom settings, as well as for starting and stopping recording. Focus updates are triggered either by the Auto-Focus Assist module or manually by the director through the Dashboard. The zoom controller, in contrast, runs whenever the gimbal is performing visual tracking.
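
A minimal sketch of an outer-loop tracking law of this kind is shown below: a simple proportional term on the position error plus the reference trajectory’s velocity as feed-forward (gains, saturation and structure are illustrative, not necessarily the controller implemented in the repository). Its output is the linear velocity reference passed to the UAL.

```python
# Hypothetical outer-loop velocity law: proportional feedback on the position error
# along the reference trajectory plus the trajectory's own velocity as feed-forward.
# The resulting linear velocity reference is what gets sent to the UAL, whose
# autopilot-side controllers act as the inner loop.

KP = 1.0        # illustrative proportional gain [1/s]
V_MAX = 5.0     # illustrative saturation of the velocity reference [m/s]

def outer_loop_velocity(pos, pos_ref, vel_ref_ff):
    """pos, pos_ref: current and reference (x, y, z) positions in metres.
    vel_ref_ff: feed-forward velocity of the reference trajectory [m/s]."""
    cmd = [KP * (r - p) + ff for p, r, ff in zip(pos, pos_ref, vel_ref_ff)]
    norm = sum(c * c for c in cmd) ** 0.5
    if norm > V_MAX:                        # keep the reference within limits
        cmd = [c * V_MAX / norm for c in cmd]
    return cmd                              # linear velocity reference for the UAL
```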

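Similarly, a minimal sketch of the gimbal attitude-error feedback follows: a proportional law on the pan/tilt error, which for GPS tracking is derived from the drone and target positions and the gimbal attitude (for visual tracking the image error would feed the same kind of law). Gains, angle conventions and names are illustrative only.

```python
# Hypothetical gimbal feedback law: angular velocity commands proportional to the
# attitude error. For GPS tracking the error comes from drone/target positions and the
# gimbal attitude; for visual tracking the image error would be used directly.
import math

K_PAN, K_TILT = 2.0, 2.0   # illustrative gains [1/s]

def gps_attitude_error(drone_pos, target_pos, gimbal_pan, gimbal_tilt):
    """Desired pan/tilt to point at the target minus the current gimbal angles [rad]."""
    dx, dy, dz = (t - d for t, d in zip(target_pos, drone_pos))
    pan_des = math.atan2(dy, dx)
    tilt_des = math.atan2(-dz, math.hypot(dx, dy))   # positive tilt points downwards here
    return pan_des - gimbal_pan, tilt_des - gimbal_tilt

def angular_velocity_command(pan_error, tilt_error):
    """Angular velocity commands [rad/s] sent to the Gimbal Camera Interface."""
    return K_PAN * pan_error, K_TILT * tilt_error
```
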
Gimbal Camera Interface

The Gimbal Camera Interface implements the communication bridge between the Action Executor and the gimbal hardware. It connects to the gimbal through a serial (UART) link, sending the velocity commands and retrieving the gimbal status: motor angles and speeds, gimbal orientation and other low-level parameters. For compactness, it handles the interactions with both the gimbal and the camera, although the two can also be launched as separate nodes if desired. At any moment during the mission, the gimbal backup pilot can take manual control of the gimbal, with the possibility of switching back to automatic mode. Camera settings (white balance, ISO, etc.) can be changed when automatic mode is selected.
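
As an illustration of the serial bridge only (the actual gimbal protocol, baud rate and frame layout are fixed by the gimbal manufacturer and by the repository’s interface node; the frame format below is made up), a velocity command could be framed and written to the UART port with pyserial:

```python
# Hypothetical sketch of the UART bridge: pack an angular-velocity command into a frame
# and send it to the gimbal. The real gimbal protocol, baud rate and frame layout differ;
# this only illustrates the pattern of the Gimbal Camera Interface.
import struct
import serial   # pyserial

def open_gimbal_port(device="/dev/ttyUSB0", baud=115200):
    return serial.Serial(device, baudrate=baud, timeout=0.1)

def send_rate_command(port, pan_rate, tilt_rate):
    """Send pan/tilt angular velocity references [rad/s] in a made-up frame:
    start byte, two little-endian floats and a simple checksum byte."""
    payload = struct.pack("<ff", pan_rate, tilt_rate)
    checksum = sum(payload) & 0xFF
    port.write(b"\xA5" + payload + bytes([checksum]))

def read_status(port):
    """Read one raw status frame (motor angles/speeds, orientation) if available."""
    return port.read(32)   # the frame length is protocol-specific; 32 is a placeholder
```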

Microcontroller Board for Gimbal Camera Interface

A Teensy LC board is the core of the hardware pipeline for the camera and gimbal, channeling the commands originating from the onboard computer and from the RC transmitter to the gimbal and camera. It is also responsible for switching between manual (pilot) and automatic control.

UAL

The UAL (UAV Abstraction Layer) is in charge of abstracting the user from the drone hardware, providing a common interface to access autopilot commands. Regardless of the underlying autopilot, this module provides drone positioning in a standard format and can receive and execute navigation commands such as land, take off, go to waypoint, set velocity, etc.

This module is available in a different repository.
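
A minimal usage sketch follows, assuming the ROS interface exposed by grvc-ual (service and topic names such as ual/take_off, ual/go_to_waypoint and ual/set_velocity, and the request fields used below, follow that repository’s conventions; check the UAL documentation for the exact definitions):

```python
#!/usr/bin/env python
# Usage sketch assuming the grvc-ual ROS interface (uav_abstraction_layer package);
# service/topic names and request fields should be checked against the UAL repository.
import rospy
from geometry_msgs.msg import PoseStamped, TwistStamped
from uav_abstraction_layer.srv import TakeOff, GoToWaypoint, Land

def fly_demo():
    rospy.init_node("ual_demo")
    rospy.wait_for_service("ual/take_off")
    take_off = rospy.ServiceProxy("ual/take_off", TakeOff)
    go_to = rospy.ServiceProxy("ual/go_to_waypoint", GoToWaypoint)
    land = rospy.ServiceProxy("ual/land", Land)
    vel_pub = rospy.Publisher("ual/set_velocity", TwistStamped, queue_size=1)

    take_off(height=2.0, blocking=True)      # take off to 2 m and wait

    wp = PoseStamped()                       # go to a waypoint, blocking
    wp.header.frame_id = "map"
    wp.pose.position.x, wp.pose.position.y, wp.pose.position.z = 10.0, 0.0, 2.0
    wp.pose.orientation.w = 1.0
    go_to(waypoint=wp, blocking=True)

    cmd = TwistStamped()                     # or stream velocity references,
    cmd.twist.linear.x = 1.0                 # as the Action Executor does
    rate = rospy.Rate(20)
    for _ in range(100):                     # roughly 5 s at 1 m/s
        vel_pub.publish(cmd)
        rate.sleep()

    land(blocking=True)

if __name__ == "__main__":
    fly_demo()
```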