Autonomous

Overview

Our autonomous system works in three main parts: the state machine, ArUco tag detection, and pathing. It receives data from external GPS, IMU, and camera sensor ROS publisher nodes. The base station handles enabling/disabling the autonomous system as well as GPS waypoint input.

Data

The autonomous system manipulates GPS coordinates as latitude-longitude pairs in signed decimal degrees. The rover's current heading and the bearing to its target are calculated in degrees relative to true north. Distances are expressed in meters whenever possible, and time in seconds. Global configuration values are stored in auto_globals.py.
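
As a reference for the values used throughout this page, here is a minimal sketch of what auto_globals.py might contain. The threshold names come from the states described later; the default values shown are illustrative, not the rover's actual tuning:

```python
# auto_globals.py -- shared configuration and state for the autonomous system.
# The values shown are illustrative defaults, not the rover's actual tuning.

# State-transition thresholds (see the state machine below).
angular_error_threshold = 10.0  # degrees; point steer when the error exceeds this
linear_error_threshold = 2.0    # meters; a waypoint counts as reached below this

# Live navigation data, updated from the GPS and IMU topics.
waypoints = []        # list of (latitude, longitude) pairs in signed decimal degrees
linear_error = None   # meters from the rover to the current waypoint
angular_error = None  # degrees between current heading and desired heading

# ArUco tags found by the vision node (see Computer Vision below).
detected_tags = []
```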

State Machine

Our state machine decides the rover's next actions based on the data from our sensors. The state updates at a rate of 2 Hz.
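
A minimal sketch of that 2 Hz update loop as a ROS node; the state class names mirror the states described below, but the update()/transition() interface is an assumption for illustration:

```python
import rospy

def run_state_machine():
    rospy.init_node('autonomous_state_machine')
    rate = rospy.Rate(2)  # the state updates at 2 Hz
    state = IdleState()   # start idle until navigation is enabled
    while not rospy.is_shutdown():
        state.update()              # act on the latest sensor data
        state = state.transition()  # pick the next state (possibly the same one)
        rate.sleep()
```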

Structure

[chart]

Idle State

The rover stays in the Idle State whenever autonomous navigation is disabled, there is no next waypoint, or invalid data is received from the IMU or GPS. This state does nothing except wait for all of the conditions needed to begin autonomous navigation to be met.

DriveTowardsWaypoint State

This state is entered when autonomous is enabled, we have a next waypoint to navigate to, and our angular_error is less than the angular_error_threshold set in auto_globals.py. It sends commands to the drive train to move the rover towards the waypoint, updating linear_error as the rover gets closer. This state also steers correctively: while angular_error stays below angular_error_threshold, the rover adjusts its steering to keep its path pointed at the waypoint, as sketched below.
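
A sketch of one tick of this state, assuming a differential (skid-steer) drive train; send_drive_command() is a stand-in for the real drive-train interface, and k_p is an illustrative proportional gain:

```python
import auto_globals

def drive_towards_waypoint(base_speed=0.5, k_p=0.02):
    """One 2 Hz tick of DriveTowardsWaypoint with corrective steering."""
    # While angular_error stays below the threshold, correct it on the move by
    # speeding up one side of the drive train proportionally to the error.
    correction = k_p * auto_globals.angular_error  # degrees -> speed offset
    left_speed = base_speed + correction
    right_speed = base_speed - correction
    send_drive_command(left_speed, right_speed)  # stand-in for the drive interface
```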

SteerTowardsWaypoint State

This state is entered when autonomous is enabled, we have a next waypoint to navigate to, and our angular_error is greater than angular_error_threshold. In this case, we point steer (rotate in place) until the rover faces the desired waypoint, then enter the DriveTowardsWaypoint state.
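
Point steering can be sketched the same way: spin the two sides of the drive train in opposite directions until the heading error drops below the threshold. As above, send_drive_command() is a stand-in for the real drive interface:

```python
import auto_globals

def steer_towards_waypoint(turn_speed=0.3):
    """One tick of SteerTowardsWaypoint: rotate in place toward the waypoint."""
    # Spin the two sides of the drive train in opposite directions; the sign of
    # angular_error picks the turn direction.
    direction = 1.0 if auto_globals.angular_error > 0 else -1.0
    send_drive_command(direction * turn_speed, -direction * turn_speed)
```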

ReachedWaypoint State

This state is entered when autonomous is enabled and our linear_error is less than linear_error_threshold. The rover then signals through the onboard LEDs that it has reached the waypoint, and the waypoint is removed from the list of next waypoints. The next state is Idle, DriveTowardsWaypoint, or SteerTowardsWaypoint, depending on whether autonomous is still enabled and whether we have a next waypoint; one way to express the transition logic is sketched below.
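
Putting the four states together, the transition rules above might look like the following; the state class names and the auto_globals fields are assumptions carried over from the earlier sketches:

```python
import auto_globals

def next_state(enabled):
    """Choose the next state from the shared navigation data."""
    if (not enabled or not auto_globals.waypoints
            or auto_globals.linear_error is None
            or auto_globals.angular_error is None):
        return IdleState()                  # disabled, no waypoint, or invalid data
    if auto_globals.linear_error < auto_globals.linear_error_threshold:
        return ReachedWaypointState()       # close enough: signal and pop the waypoint
    if abs(auto_globals.angular_error) > auto_globals.angular_error_threshold:
        return SteerTowardsWaypointState()  # too far off heading: point steer
    return DriveTowardsWaypointState()      # otherwise drive with corrective steering
```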

GPS & IMU

The rover receives GPS position data from an Adafruit GPS board via the **GPS_Streamer.py** module. This publisher sends data to our GPS topics at a rate of 10 Hz. The autonomous system takes its current position and the coordinates of the next waypoint and, using the WGS-84 ellipsoid model of the Earth, solves the inverse geodesic between them, expressing its length as linear_error in meters. This process is used to ensure accuracy: GPS is the primary navigation method, so it must be as accurate as we can manage.
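
The wiki does not name the geodesic solver, but the computation can be sketched with the geographiclib package, which solves the inverse problem on the WGS-84 ellipsoid directly:

```python
from geographiclib.geodesic import Geodesic

def inverse_geodesic(current, waypoint):
    """Solve the inverse geodesic between two (lat, lon) pairs in decimal degrees.
    Returns (linear_error in meters, desired heading in degrees from true north)."""
    g = Geodesic.WGS84.Inverse(current[0], current[1], waypoint[0], waypoint[1])
    return g['s12'], g['azi1']  # s12: geodesic length; azi1: initial azimuth

# Example: distance and bearing between two points near Ames, Iowa.
linear_error, desired_heading = inverse_geodesic((42.0267, -93.6465),
                                                 (42.0300, -93.6400))
```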

We also obtain the desired heading from the inverse geodesic. Since a GPS heading is only accurate when the rover is moving across large distances, we use an IMU, the Adafruit BNO055, to measure the current heading accurately. With the current heading and the desired heading, we then calculate our angular error, as in the sketch below.
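
A sketch of the angular-error calculation; the wrap into [-180, 180] keeps the error signed so the state machine knows which way to turn (the sign convention here, positive meaning clockwise, is an assumption):

```python
def compute_angular_error(current_heading, desired_heading):
    """Signed angular error in degrees, wrapped into [-180, 180].
    Positive means the waypoint lies clockwise of the current heading."""
    return (desired_heading - current_heading + 180.0) % 360.0 - 180.0
```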

The values linear_error and angular_error are used by the autonomous state machine to determine which state we should enter next.

Calibration and Calibration Matrices

The BNO055 has built-in functions, declared in its header, for reading both the calibration offsets and the calibration status of the sensor: calling getSensorOffsets or isFullyCalibrated returns that data. See Adafruit_BNO055.h for the full list of functions available for the BNO055 in the Arduino IDE.
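
The functions above are from the Arduino library; if the sensor is instead read from Python, Adafruit's adafruit_bno055 CircuitPython driver exposes the same information, as in this sketch:

```python
import board
import adafruit_bno055

i2c = board.I2C()
sensor = adafruit_bno055.BNO055_I2C(i2c)

# calibration_status returns (system, gyroscope, accelerometer, magnetometer),
# each 0 (uncalibrated) through 3 (fully calibrated); sensor.calibrated is True
# once every subsystem reports 3.
sys_cal, gyro_cal, accel_cal, mag_cal = sensor.calibration_status
print("fully calibrated:", sensor.calibrated)
```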

Other ways to calibrate: calibration matrices can be applied to the raw readings, with the resulting x, y, z vector being the final calibrated value; more information on the specific matrices is available from VectorNav, and a sketch follows this paragraph. Calibration can also be checked by comparing the output of a separate, standalone accelerometer against the output of the accelerometer on the rover at the same time.
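
A sketch of the matrix approach, following the hard-iron/soft-iron model VectorNav describes; the matrix and offset values here are placeholders for the results of an actual calibration run:

```python
import numpy as np

# Placeholder values; the real matrix and offset come from a calibration run.
soft_iron = np.array([[1.02, 0.01, 0.00],
                      [0.01, 0.98, 0.02],
                      [0.00, 0.02, 1.00]])  # 3x3 correction matrix
hard_iron = np.array([12.5, -3.2, 8.1])     # constant offset, in sensor units

def calibrate(raw_xyz):
    """Return the final calibrated x, y, z vector for one raw reading."""
    return soft_iron @ (np.asarray(raw_xyz) - hard_iron)
```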

Computer Vision

The URC challenge guidelines describe a computer vision portion of the autonomous mission. We use OpenCV's built-in ArUco tag detection to find ArUco tags in our main dome camera's view. We then calculate the heading towards a tag by mapping its position in the camera's FOV to degrees. Detected tags are stored in a list in auto_globals.py.
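
A sketch of the detection and the FOV-to-degrees mapping using OpenCV's aruco module (the pre-4.7 API); the 4x4 tag dictionary, the 90-degree horizontal FOV, and the detected_tags list name are assumptions:

```python
import cv2
import auto_globals

# Assumed: URC-style 4x4 tag dictionary and a 90-degree horizontal camera FOV.
DICTIONARY = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
PARAMS = cv2.aruco.DetectorParameters_create()
HORIZONTAL_FOV = 90.0  # degrees

def detect_tags(frame):
    """Find ArUco tags and map each tag's position in the FOV to a heading offset."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, DICTIONARY, parameters=PARAMS)
    auto_globals.detected_tags = []
    if ids is None:
        return
    width = frame.shape[1]
    for tag_id, quad in zip(ids.flatten(), corners):
        center_x = quad[0][:, 0].mean()  # mean pixel x of the tag's four corners
        # Map the pixel position to degrees left/right of the camera's center line.
        heading_offset = (center_x / width - 0.5) * HORIZONTAL_FOV
        auto_globals.detected_tags.append((int(tag_id), heading_offset))
```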

LED Indicator