moveit_wiki - IRS-group/isr_tiago_docs GitHub Wiki
Change octomap range and resolution
To change the octomap range and resolution, edit the following files:
roscd tiago_moveit_config
vi launch/tiago_moveit_sensor_manager.launch.xml
vi launch/sensor_manager.launch.xml
vi config/sensors_rgbd.yaml
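A sketch of what config/sensors_rgbd.yaml typically contains (the field names follow MoveIt's PointCloudOctomapUpdater plugin; the topic name and values below are illustrative, not the robot's actual configuration):

```yaml
sensors:
  - sensor_plugin: occupancy_map_monitor/PointCloudOctomapUpdater
    point_cloud_topic: /xtion/depth_registered/points   # illustrative topic name
    max_range: 2.0          # octomap range, in meters
    point_subsample: 1      # use every n-th point of the cloud
    padding_offset: 0.1
    padding_scale: 1.0
```

The octomap resolution itself is usually set through the octomap_resolution parameter in sensor_manager.launch.xml.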
Enable/Disable Octomap
Enter the robot: tiago_lan
Then run:
su
rw
chroot /ro
rosed tiago_moveit_config move_group.launch
# Change the camera argument to true/false
exit
ro
reboot
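For reference, the line to change in move_group.launch should be an argument declaration along these lines (the exact argument name and default value are assumptions; check the file on your robot):

```xml
<arg name="camera" default="true"/>  <!-- set to false to disable the octomap camera input -->
```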
Pick and place
The manipulation node is launched automatically, both on the real robot and in simulation. On the robot you don't have to do anything; in simulation, just launch it by running:
roslaunch mbot_simulation_environments tiago_isr_testbed.launch
If you require the grasping functionalities and extra predefined motions run:
roslaunch tiago_manipulation tiago_manipulation.launch
The manipulation can be used through the API, by calling the pick_object method.
You can also trigger a pick movement by publishing the grasp pose directly on the topic /tiago_manipulation_ik/end_effector_pose.
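For example, assuming the topic expects a geometry_msgs/PoseStamped message (this message type, and the target pose values below, are assumptions for illustration; check the actual type with rostopic info /tiago_manipulation_ik/end_effector_pose):

```shell
rostopic pub --once /tiago_manipulation_ik/end_effector_pose geometry_msgs/PoseStamped \
"header:
  frame_id: 'base_footprint'
pose:
  position: {x: 0.6, y: 0.0, z: 0.9}
  orientation: {x: 0.0, y: 0.0, z: 0.0, w: 1.0}"
```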
Pick and place (OLD)
If you are not on the real robot, you can run the simulation by launching:
roslaunch tiago_gazebo tiago_gazebo.launch public_sim:=true robot:=steel world:=tutorial_office gzpose:="-x 1.40 -y -2.79 -z -0.003 -R 0.0 -P 0.0 -Y 0.0" use_moveit_camera:=true
Make sure play_motion is running. In simulation it is run automatically during bringup. If you need to start it manually, just run:
rosrun play_motion play_motion
Make sure that the bayes_objects_tracker target_frame (this frame is set inside the bayes_objects_tracker launch file) exists and that objects can be transformed from that frame to base_footprint.
Launch the object detector, object localizer and Bayes object tracker:
roslaunch bayes_objects_tracker bayes_objects_tracker_no_namespaces.launch
Launch the pick_and_place_server:
roslaunch manipulation pick_place_server.launch
The manipulation actions are implemented in the file:
actions_tiago/src/actions_tiago_ros/manipulation.py
You can import it and use it inside your code by doing:
from manipulation import Manipulation

manipulate = Manipulation()
manipulate.pick(pick_object_name='cup', lift_object=True)      # pick the object and lift it
manipulate.place(predefined_location='table', tuck_arm=True)   # place it, then tuck the arm
manipulate.predefined_motion(motion_name='home')               # run a predefined motion
Tools
If you need to adjust the head:
rosrun rqt_joint_trajectory_controller rqt_joint_trajectory_controller
If you want to move the arm/torso to a pre-defined configuration:
rosrun play_motion run_motion <configuration_name>
To check the predefined configurations run:
rosparam list | grep "play_motion/motions" | grep "meta/name" | cut -d '/' -f 4
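The shell pipeline above just extracts the fourth '/'-separated field from parameter names like /play_motion/motions/&lt;name&gt;/meta/name. The same extraction in Python, if you prefer to do it in code (the parameter names below are an illustrative example; in practice you would obtain them from rosparam):

```python
# Example parameter names as returned by `rosparam list`:
param_names = [
    "/play_motion/motions/home/meta/name",
    "/play_motion/motions/wave/meta/name",
    "/play_motion/motions/pregrasp/meta/name",
]

# Field 4 of `cut -d '/' -f 4` corresponds to index 3 after splitting on '/'
# (index 0 is the empty string before the leading slash).
motion_names = [p.split("/")[3] for p in param_names]
print(motion_names)  # ['home', 'wave', 'pregrasp']
```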
The bayes object tracker is triggered automatically inside the manipulation script, but if you need to trigger it manually, just publish:
rostopic pub /bayes_objects_tracker/event_in std_msgs/String "data: 'e_start'"
Create new motions
There are a few steps to follow in order to create new motions.
- Pose the robot
- Capture the values of the joints
- Save joints' values in a yaml file
There are several ways to pose the robot in a desired position. You can move each joint of the physical robot by hand, or use rqt_joint_trajectory_controller or the motion planning tool in RViz.
If you want to use rqt_joint_trajectory_controller, there is a menu where you can select a group of joints and then change the position of each joint. To launch it, just run:
rosrun rqt_joint_trajectory_controller rqt_joint_trajectory_controller
Press the power button to enable the controller, select the group of joints and change the positions of the joints.
Once you have the desired position, there are two ways to continue.
The first is to read the state of the arm and torso and paste the values into a yaml file.
To read the state of the arm:
rostopic echo /arm_controller/state
To read the state of the torso:
rostopic echo /torso_controller/state
Then, paste the values into your new_motions.yaml file. It should be similar to this image.
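A sketch of what such a new_motions.yaml can look like (standard play_motion format; the joint names, positions and timings below are illustrative, use the values you copied from the state topics):

```yaml
play_motion:
  motions:
    my_new_motion:
      joints: [torso_lift_joint, arm_1_joint, arm_2_joint]   # joints to command
      points:
        - positions: [0.15, 0.20, -1.34]   # values read from the */state topics
          time_from_start: 0.0
        - positions: [0.25, 0.50, -1.00]
          time_from_start: 3.0
      meta:
        name: my_new_motion
        usage: demo
        description: 'Example motion'
```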
After that, it has to be loaded into the parameter server.
The other way of doing it is with the play_motion_builder.
This tool presents a simple interface for creating play_motion-based motions by defining lists of keyframes which the system then interpolates between. The tool simplifies capturing, editing and modifying these keyframes.
To create new motions, make sure the play_motion_builder packages are installed correctly. Everything needed is here: https://github.com/pal-robotics/play_motion_builder.git
Make sure play_motion_builder is running. To start it, just run:
rosrun play_motion_builder play_motion_builder_node
Then, run rqt_play_motion_builder to use the tool, which allows simple control of the motion creation pipeline:
rosrun rqt_play_motion_builder rqt_play_motion_builder
The interface is quite intuitive. Press New to start; there is a box where you can select the joint group you want to save, and you can also add the head's joints if required. Then press Capture Keyframe and edit the time if needed. Save the motion and you can use it alongside the rest of the predefined configurations.
In both cases, the yaml file where the new motions are saved has to be loaded into the parameter server:
rosparam load <yaml_file_path>
To run a motion:
rosrun play_motion run_motion <motion_name>