Configuring Odometry in ROS
Written by: Erick Mejia Uzeda
Odometry data is generally published to ROS on the `odom` topic. An Odometry message (`nav_msgs/Odometry`) is structured as follows:
```
# This represents an estimate of a position and velocity in free space.
# The pose in this message should be specified in the coordinate frame given by header.frame_id.
# The twist in this message should be specified in the coordinate frame given by the child_frame_id
Header header
string child_frame_id
geometry_msgs/PoseWithCovariance pose
geometry_msgs/TwistWithCovariance twist
```
- Where `header` specifies a sequence ID (for ordering messages), a timestamp (for synchronizing messages), and a `frame_id` (the coordinate frame the pose is given in).
- Where `child_frame_id` specifies the coordinate frame that the twist is given in.
- Where `PoseWithCovariance` indicates the currently estimated position and orientation (pose) of the robot based on the odometry sensor. It also has a covariance property that indicates the uncertainty of the given pose value.
- Where `TwistWithCovariance` indicates the currently estimated linear and angular velocity (twist) of the robot based on the odometry sensor. It also has a covariance property that indicates the uncertainty of the given twist value.
Given that we have multiple sensors that provide us with Odometry data, we would like to leverage them all to obtain more accurate localization and movement readings. The `robot_localization` ROS package fuses Odometry data to do just that. This video gives a good high-level overview of what `robot_localization` does and how to use it.
This section covers the general settings that need to be set up for `robot_localization`, and assumes all your Odometry sensors have been configured to publish some form of Odometry message on a ROS topic. See the next section for how to configure the sensors we use.
To install `robot_localization`, run:

```
$ sudo apt-get install ros-melodic-robot-localization
```
NOTE: If this does not work, try installing the package into your workspace from source. See the ROS answers page for details.
NOTE: The following information can be found in more detail at the `robot_localization` rosdocs.
To set up `robot_localization` in a launch file for sensors that publish the following message types:

- `nav_msgs/Odometry`
- `geometry_msgs/PoseWithCovarianceStamped`
- `geometry_msgs/TwistWithCovarianceStamped`
- `sensor_msgs/Imu`

we launch the following node:
```xml
<!-- [*kf] can be either "ekf" (Extended Kalman Filter) or "ukf" (Unscented Kalman Filter) -->
<node pkg="robot_localization" type="[*kf]_localization_node" name="[*kf]_localization" output="screen">
  ...
  <!-- Specify ROS params here -->
  ...
</node>
```
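For example, here is a sketch of an EKF instance with a few commonly set parameters. The frame names follow the REP-105 defaults and the values shown are illustrative, not necessarily what Caffeine uses:

```xml
<node pkg="robot_localization" type="ekf_localization_node" name="ekf_localization" output="screen">
  <!-- How often (in Hz) the filter publishes its state estimate -->
  <param name="frequency" value="30"/>
  <!-- For a ground robot, ignore z, roll, and pitch -->
  <param name="two_d_mode" value="true"/>
  <!-- Standard REP-105 frame names -->
  <param name="map_frame" value="map"/>
  <param name="odom_frame" value="odom"/>
  <param name="base_link_frame" value="base_link"/>
  <!-- When fusing only continuous data (wheel odometry, IMU), the world frame is odom -->
  <param name="world_frame" value="odom"/>
</node>
```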
Each sensor to be used must be named appropriately (depending on the type of sensor it is) and must specify the topic that publishes the appropriate message type. Here is the general sensor specification:

```xml
<param name="[sensor]" value="[namespace]/[topic]"/>
```
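The sensor names follow the pattern `odom0`, `odom1`, ..., `imu0`, `pose0`, `twist0`, and so on, depending on the message type. A minimal sketch, where the topic names are placeholders for whatever our drivers actually publish:

```xml
<!-- Wheel-encoder odometry (nav_msgs/Odometry) as the first odometry input -->
<param name="odom0" value="caffeine/wheel_odom"/>
<!-- IMU (sensor_msgs/Imu) as the first IMU input -->
<param name="imu0" value="caffeine/imu/data"/>
```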
Each sensor is configured to specify which values it provides to `robot_localization` using a 5 x 3 boolean matrix (a flat list of 15 booleans) as follows:

```xml
<rosparam param="[sensor]_config">[x_pos,     y_pos,      z_pos,
                                   roll,      pitch,      yaw,
                                   x_vel,     y_vel,      z_vel,
                                   roll_rate, pitch_rate, yaw_rate,
                                   x_accel,   y_accel,    z_accel]</rosparam>
```
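For example, the wheel-encoder and IMU inputs sketched above might be configured as follows. Which fields to fuse depends on the sensor, so these values are illustrative rather than Caffeine's actual configuration:

```xml
<!-- odom0 (wheel encoders): fuse forward/lateral velocity and yaw rate only -->
<rosparam param="odom0_config">[false, false, false,
                                false, false, false,
                                true,  true,  false,
                                false, false, true,
                                false, false, false]</rosparam>

<!-- imu0: fuse orientation, angular velocity, and linear acceleration -->
<rosparam param="imu0_config">[false, false, false,
                               true,  true,  true,
                               false, false, false,
                               true,  true,  true,
                               true,  true,  true]</rosparam>
```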
To use the GPS with `robot_localization`, we interface it with its `navsat_transform_node`. This node publishes an `Odometry` message containing the GPS coordinates transformed into the robot's world frame. That message is then passed into the `[*kf]_localization_node` as another input to be fused.
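A minimal sketch of such a launch entry, assuming the GPS and IMU drivers publish on the placeholder topics shown in the remaps (and `magnetic_declination_radians` must be set for the robot's actual location):

```xml
<node pkg="robot_localization" type="navsat_transform_node" name="navsat_transform" output="screen">
  <!-- Magnetic declination (in radians) at the robot's location; 0.0 is a placeholder -->
  <param name="magnetic_declination_radians" value="0.0"/>
  <!-- Set to pi/2 if the IMU reads 0 yaw when facing north instead of east -->
  <param name="yaw_offset" value="0.0"/>
  <!-- Discard altitude so the output stays planar -->
  <param name="zero_altitude" value="true"/>

  <!-- Inputs: an IMU with absolute orientation, the filter's own output, and the raw GPS fix -->
  <remap from="imu/data" to="caffeine/imu/data"/>
  <remap from="odometry/filtered" to="odometry/filtered"/>
  <remap from="gps/fix" to="caffeine/gps/fix"/>
</node>
```

The node publishes the transformed GPS position on `odometry/gps`, which can then be given to the filter as an additional input (e.g. `odom1`) with a config that fuses only the x and y position.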