Tutorial: RVIZ tango_sensors - OwlSoul/tango_sensors GitHub Wiki
Introduction
This is a small tutorial demonstrating the "tango_sensors" app's capabilities. By default, several transforms are published together with the raw data to provide a simple "switch on and see how it works" scenario. You are not advised to use these transforms in real applications; the point of the app is to provide raw data only.
Demonstration
- Launch your ROS master on the PC.
ubuntu@ubuntu:~$ roscore
- Launch the "tango_sensors" app on the tablet and connect it to your ROS master. Enable all sensors (pose estimation, point cloud, and both cameras).
You can verify that all topics are listed and that messages are arriving (shown here for the pose and point cloud topics):
ubuntu@ubuntu:~$ rosnode list
ubuntu@ubuntu:~$ rostopic list
ubuntu@ubuntu:~$ rostopic hz /tango_sensors/tango_pose
ubuntu@ubuntu:~$ rostopic hz /tango_sensors/point_cloud
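As a side note, the rate that rostopic hz reports is essentially the mean inter-arrival rate of messages on a topic. A minimal Python sketch of that computation (the timestamps below are hypothetical, not captured from a real topic):

```python
# Sketch of the mean-rate computation behind "rostopic hz".
# The timestamps are hypothetical arrival times in seconds, not data
# recorded from a real /tango_sensors topic.

def mean_rate(arrival_times):
    """Return the average messages-per-second for a list of arrival times."""
    if len(arrival_times) < 2:
        return 0.0
    deltas = [b - a for a, b in zip(arrival_times, arrival_times[1:])]
    return 1.0 / (sum(deltas) / len(deltas))

# Poses arriving at roughly 10 Hz:
stamps = [0.0, 0.1, 0.2, 0.3, 0.4]
print(round(mean_rate(stamps), 1))  # expect about 10.0
```

If the reported rate is far below what you expect (e.g., the point cloud topic dropping to a fraction of a hertz), check your Wi-Fi link and lighting conditions first.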
- Launch RViz.
ubuntu@ubuntu:~$ rviz
Add displays "By topic" for all of them: both camera images, point_cloud, tango_pose, and tango_pose_pcl (the pose published together with the point cloud). Move around with your Tango device and see how it works.
The blue arrow is the pose captured when the point cloud was available; the red arrow is the Tango VIO pose estimate.
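Since point clouds arrive less frequently than VIO updates, the two arrows can diverge slightly. The offset between them is just the straight-line distance between the two reported positions; a hedged sketch with illustrative numbers (not recorded from a device):

```python
import math

# Hypothetical positions (x, y, z) in meters: one from the pose published
# with the point cloud (blue arrow), one from the VIO estimate (red arrow).
# These values are illustrative, not captured from a real Tango.
pose_pcl = (1.00, 0.50, 0.00)
pose_vio = (1.02, 0.48, 0.01)

# Straight-line (Euclidean) distance between the two positions.
offset = math.dist(pose_pcl, pose_vio)
print(f"{offset:.3f} m")  # prints "0.030 m"
```

A small, steady offset while moving is normal; a large or growing one suggests the point cloud poses are stale.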
A good test is to point your Tango at a wall. Remember that point cloud computation does not work outdoors in daylight; use a room with electric lighting for this test.
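The wall test is easy to sanity-check numerically: points sampled on a flat wall should all sit at roughly the same depth. A minimal sketch with hypothetical (x, y, z) samples in the camera frame (not a captured Tango point cloud):

```python
# Hypothetical point cloud samples (x, y, z) in meters, camera frame.
# A flat wall about 1.5 m away should give depths with a small spread.
points = [
    (-0.2,  0.1, 1.51),
    ( 0.0,  0.0, 1.49),
    ( 0.1, -0.1, 1.50),
    ( 0.2,  0.2, 1.50),
]

depths = [z for (_, _, z) in points]
mean_depth = sum(depths) / len(depths)
spread = max(depths) - min(depths)
print(f"mean depth {mean_depth:.2f} m, spread {spread:.2f} m")
```

If the spread is large while pointing at a flat wall, the depth sensor is likely struggling (reflective surface, too much ambient infrared, or too close to the wall).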
As you can see, the point cloud data is published together with the VIO pose estimate and both camera images. That is basically it.