Tutorial: RVIZ tango_sensors

Introduction

This is a small tutorial demonstrating the capabilities of the "tango_sensors" app. By default, several transforms are published together with the raw data to provide a simple "switch it on and see how it works" scenario. You are not advised to use these transforms in real applications; the point of the app is to provide raw data only.
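If you do use the raw data in a real application, you will typically publish your own transforms instead of relying on the demo ones. Below is a minimal sketch using the standard tf static_transform_publisher; the frame names ("map" and "tango_device") are placeholders, not the names the app actually publishes:

# Tie a device frame to a fixed map frame at the origin, republished every 100 ms
ubuntu@ubuntu:~$ rosrun tf static_transform_publisher 0 0 0 0 0 0 map tango_device 100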

Demonstration

  1. Launch your ROS master on the PC.

ubuntu@ubuntu:~$ roscore
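Since the tablet connects over the network, the master must be reachable from the tablet's side. A minimal sketch, assuming the PC's address is 192.168.1.10 (replace it with your own); run these exports in the same shell before starting roscore and any other ROS tools:

# Advertise the PC's address and point local ROS tools at the master
ubuntu@ubuntu:~$ export ROS_IP=192.168.1.10
ubuntu@ubuntu:~$ export ROS_MASTER_URI=http://192.168.1.10:11311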

  1. Launch "tango_sensors" app on the Tablet, connect to your ROS master. Enable all sensors (Pose estimation, Point Cloud, both cameras).

You can check that the node and all topics are visible, and that messages are actually arriving (shown here for the pose and point cloud topics):

ubuntu@ubuntu:~$ rosnode list

"rosnode list" result

ubuntu@ubuntu:~$ rostopic list

"rostopic list" result

ubuntu@ubuntu:~$ rostopic hz /tango_sensors/tango_pose

"rostopic hz /tango_sensors/tango_pose" result

ubuntu@ubuntu:~$ rostopic hz /tango_sensors/point_cloud

"rostopic hz tango_sensors/tango_pose" result

  3. Launch RVIZ.

ubuntu@ubuntu:~$ rviz

In RVIZ, add all of the topics using the "By topic" tab: both camera images, point_cloud, tango_pose, and tango_pose_pcl (the pose published together with the point cloud). Move around with your Tango and see how it works.
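Once the displays look right, you can save the RVIZ configuration (File -> Save Config As) and load it directly next time; the file name below is just an example. If nothing shows up, make sure the Fixed Frame in Global Options matches the frame_id reported by the rostopic echo check above.

ubuntu@ubuntu:~$ rviz -d ~/tango_sensors_demo.rviz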

RVIZ 01

The blue arrow is the pose taken when a point cloud was available; the red arrow is the Tango VIO pose estimate.

RVIZ 02

A good test is to point your Tango at a wall. Remember that point cloud computation will not work outdoors in daylight; use a room with electric light for this test.
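To confirm that points are actually being produced while you look at the wall, you can print one point cloud message without dumping the whole data array; this assumes the topic carries a sensor_msgs/PointCloud2, whose width and height fields give the number of points:

ubuntu@ubuntu:~$ rostopic echo --noarr -n 1 /tango_sensors/point_cloud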

RVIZ 03

As you can see, the point cloud data is published together with the VIO pose estimate and both camera images. Basically, that is it.
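If you want to keep the raw data for offline processing, one option is to record everything the app publishes with rosbag. The regular expression below is a sketch that assumes all of the app's topics live under the /tango_sensors namespace, as the topic names above suggest:

ubuntu@ubuntu:~$ rosbag record -e "/tango_sensors/.*" -O tango_demo.bag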