Secondary Work - Pieman130/dtr_open GitHub Wiki

More work has been done integrating parts together.

The motors, servos, camera, and IMU are all connected to the Raspberry Pi Zero.

The current pinouts on the Raspberry Pi are:

Based on the state model from here, I have set up a shell Python script that handles the top-level state flow, using this state transition model:

The top-level script is currently called Blimpy and is based on the OODA (observe, orient, decide, act) loop model. The blimp makes a series of observations depending on its current state and, based on those observations, takes a specific action, usually in the form of specific commands to the motors.

The basic steps are as follows:

  1. At initial start, it checks for a connected joystick; if none is found, it spins until one is connected.
  2. Read the joystick commands. Mappings currently exist only for the PS3 controller. To keep from inadvertently sending commands, a 'dead man's switch' is required: hold down the SQUARE button while pressing another control button.
  3. Blimpy remains in this state until START is pressed on the controller, signifying that the blimp is in position and ready to operate autonomously. A blue LED on the RPi also illuminates to show it is ready. Pressing the START button again returns Blimpy to manual control only.
  4. Blimpy then makes observations (detections), shown in purple in the diagram above.
  5. The blue triangles represent conditional statements that check the results of those observations. Each detection and its subsequent conditional statement determines the current state.
  6. The orange blocks represent actions taken based on the current state.
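The observe/decide/act loop described in the steps above can be sketched roughly as follows. Note this is an illustrative sketch, not the actual Blimpy code: the state names, observation dictionary, and motor-command strings are all placeholders.

```python
from enum import Enum, auto

class State(Enum):
    """Hypothetical states; the real state model comes from the diagram above."""
    MANUAL = auto()
    SEARCH_TARGET = auto()
    APPROACH_TARGET = auto()

def observe(state):
    """Observations (purple blocks): run the detections for the current state."""
    # Placeholder: the real code would query the camera/IMU here.
    return {"target_found": False}

def decide(state, obs):
    """Conditionals (blue triangles): pick the next state from the observations."""
    if state is State.SEARCH_TARGET and obs["target_found"]:
        return State.APPROACH_TARGET
    return state

def act(state):
    """Actions (orange blocks): map the current state to a motor command."""
    commands = {State.SEARCH_TARGET: "rotate", State.APPROACH_TARGET: "forward"}
    return commands.get(state, "idle")

# One pass of the OODA-style loop (the real loop runs until START toggles manual)
state = State.SEARCH_TARGET
obs = observe(state)
state = decide(state, obs)
cmd = act(state)
```

Keeping observation, decision, and action as separate functions mirrors the diagram and makes it easy to stub out detections that are not implemented yet.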

The structure is built, but most of the observation and action functions are not implemented. To date we have:

  • Detect object (goal, target) - based on building a color mask from the HSV-converted image. The target mask is built, but nothing is done yet with the goal colors (yellow and orange). The masks will need to be tailored to the specific lighting in the highbay.
  • Detect object is currently implemented with OpenCV blob detection, but the current parameters do not lead to reliable detection. The idea is to find a sufficiently large blob and assume it is the object being looked for.
  • Other detect-object options include contour detection, edge detection, or simply counting the number of white pixels in the mask.
  • Detect April tag has rudimentary detection capability, but it has not been tested at distances greater than a few inches or with the camera off-center or slanted relative to the tag.
  • Nothing is implemented for target captured or target in range; notionally we would have to determine how far we are from the object. It might be possible to do that with just a color mask, but realistically it will require a secondary sensor such as a range finder.
  • Camera position affects how we determine whether the target game ball is captured, because the ball could block the camera's view so it cannot find the goal, as shown in these two illustrations. Front-mounted camera: easier to see whether we have captured the ball, but the ball would block the camera from detecting the goal and/or April tag. Bottom-mounted camera: could find the goal/April tag with the game ball captured, but it might be harder to determine when we have actually captured the ball.

Other things that have not been implemented:

  • The BNO055 IMU is integrated with the RPi, and Blimpy has functions to initialize it, but it is not currently utilized. It will likely be useful when finer control is needed, such as capturing the target or orienting toward the goal.
  • A PID controller will also be needed, both for moving to an object and for capturing the target. It should take the error from the observation (in most cases the offset between the center of the object and the center of the camera frame) and provide a smoothed motor command to the action vector.
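A minimal version of the PID controller described above could look like the sketch below. The gains and the pixel-offset example are placeholder values, not tuned for the blimp.

```python
class PID:
    """Minimal PID controller: takes an error (e.g. pixel offset of the target
    from the image center) and returns a smoothed, clamped motor command."""

    def __init__(self, kp, ki, kd, out_limit=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_limit = out_limit       # clamp output to [-out_limit, out_limit]
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        self.integral += error * dt
        # No derivative term on the first sample (no previous error yet)
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * deriv
        return max(-self.out_limit, min(self.out_limit, out))

# Example: target seen 40 px right of image center -> small positive yaw command
yaw_pid = PID(kp=0.01, ki=0.0, kd=0.002)
cmd = yaw_pid.update(error=40, dt=0.1)
```

Clamping the output keeps the command within whatever range the motor action vector expects, and the derivative term provides the smoothing mentioned above.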

On the RPi itself there are a few scripts for testing out various elements:

  • manual_control.py - allows manual control of the blimp via the PS3 controller; it will spin until the controller is connected via Bluetooth.
  • blob_test.py - uses OpenCV to build a binary mask, uses it to determine where potential blobs are, and prints the blob sizes to the screen. The various blob detector parameters are exposed as well.
  • april_test.py - a very simple example of the April tag detector utilizing the python3 library. It just prints out the value of the detected tag.
  • IMU_test.py - connects to the IMU and prints out raw accelerometer data and Kalman-filtered Euler angles.

NOTE - If the PS3 controller doesn't connect

If for some reason the PS3 controller does not connect to the RPi Zero, you may have to restart the daemon that manages the connection: `sudo systemctl restart sixad`. Wait a few seconds and then press the PS button on the controller. Sometimes it takes more than one try.