MoveItNotes - SharedAutonomyToolkit/shared_autonomy_perception GitHub Wiki
In this project, our goal was to look at how improving the robot's representation of the world using shared autonomy would improve manipulation outcomes. We chose to use MoveIt! for the manipulation part of our pipeline for a number of reasons:
- it appeared to be the most state-of-the-art collection of arm planning algorithms
- actively developed / active mailing list (unlike arm-navigation)
- reasonably well-documented interface for how to update its internal world representation
- it already had pick/place actions implemented
I'm impressed with the plans generated by MoveIt: they do a very good job of creating smooth motion given a correct world representation. However, as the FAQ below attests, we ran into a number of problems, the biggest (and as-yet unresolved) of which involves its internal octomap data structure.
We had hoped to let MoveIt! subscribe to the kinect data and update its internal octomap for obstacle avoidance (it has nice features for re-evaluating plans mid-execution if something gets in the way), with us only having to provide bounding boxes to semantically label the objects we want to manipulate.
However, the octomap updating is buggy enough that this is unworkable.
- Clearing voxels doesn't work
- Voxels fully inside objects don't get cleared (Ioan says this is expected behavior)
- Sometimes, voxels that actually intersect bounding boxes persist (WTF?)
- There appears to be a cache that saves old sensor data and refreshes the map with it, even after resetting it by sending an empty diff.
(AKA problems LEL ran into trying to get pick-and-place working ...)
Q: Why isn't the gripper actuating?
A: Previously, the PR2's gripper commands used the joints ['l_gripper_l_finger_joint', 'l_gripper_r_finger_joint', 'l_gripper_r_finger_tip_joint', 'l_gripper_l_finger_tip_joint']. However, under pr2_moveit_config, only the joint ['l_gripper_motor_screw_joint'] is used. Yes, this makes no sense, given that that joint doesn't actually change value when grasping, but that's what we have. If using the cluster_grasp_planner, check the config for the PR2 hand and gripper.
Q: Why is there an offset between the RobotModel and the MotionPlanning SceneRobot?
A1: Change the Fixed Frame in rviz. It doesn't matter to what; once it's changed, the two models will refresh and snap together.
A2: If you're running the EKF to publish a transform between odom_combined and base_link, it'll be slightly non-zero, leading to an apparent offset between the two. To fix this, you can kill the EKF and run `rosrun tf static_transform_publisher 0 0 0 0 0 0 odom_combined base_footprint 33`
Q: In move_group output [ WARN]: Unable to transform object from frame 'object_frame' to planning frame '/odom_combined'
A: The grasp planning and bounding box code published ONE message defining an object frame, but moveit continually checks that all frames in the graph are connected. Fix: comment out the sendTransform lines in object_manipulator/cluster_bounding_box_finder*.py and pr2_gripper_grasp_planner_cluster/point_cluster_grasp_planner.py
Q: All of my grasps are failing in the Pick action
A:
- try visualizing with debug:=true or `<arg name="info" value="true"/>` in the move_group launch file
- listen to the MarkerArray on topic move_group/display_grasp_markers; markers are colored by stage # (RGB; the same colors are used for other pipelines):
  - 0 - (0.5, 0.5, 0.5)
  - 1 - (1.0, 0.0, 0.0) reachable and valid
  - 2 - (1.0, 0.5, 0.0) approach and translate
  - 3 - (0.0, 1.0, 1.0) plan stage
  - 4 - (0.0, 1.0, 0.0) success!
  - 5 - (1.0, 0.0, 1.0)
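For quick eyeballing in rviz, the stage numbers and colors above can be captured in a small lookup. This is plain Python of my own; the names are not a MoveIt API:

```python
# Mapping from pick-pipeline stage number to (RGB color, meaning), taken
# from the marker colors listed above. Stages 0 and 5 have no meaning
# recorded in these notes.
PICK_STAGE_COLORS = {
    0: ((0.5, 0.5, 0.5), None),
    1: ((1.0, 0.0, 0.0), "reachable and valid"),
    2: ((1.0, 0.5, 0.0), "approach and translate"),
    3: ((0.0, 1.0, 1.0), "plan stage"),
    4: ((0.0, 1.0, 0.0), "success"),
    5: ((1.0, 0.0, 1.0), None),
}

def stage_for_color(rgb, tol=1e-3):
    """Reverse lookup: which pick stage produced a marker of this color?"""
    for stage, (color, _) in PICK_STAGE_COLORS.items():
        if all(abs(a - b) <= tol for a, b in zip(rgb, color)):
            return stage
    return None
```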
Q: OK, so what the heck is the pick pipeline, and what do the stages mean?
A: All the code for this is in moveit_ros/manipulation/pick_place/src. The pick logic is in pick.cpp, and the planning pipeline is in manipulation_pipeline.cpp. The stages are also defined there:
- 1 - ReachableAndValidPoseFilter - looks for a valid IK solution for the grasp pose that's not in collision with anything else
- 2 - ApproachAndTranslateStage - tries to find valid plans for pre-grasp -> grasp, then grasp -> retreat
- 3 - PlanStage - includes more trajectory generation (gripper open/close, approach to the pre-grasp)
For each supplied grasp, its constraints are added as a plan to the pipeline, then the pipeline is run until one plan succeeds. At each stage of the pipeline, the evaluate() function is called with the appropriate PlanningStage.
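That loop can be sketched in plain Python (a schematic illustration only; the real implementation is C++ in manipulation_pipeline.cpp, and every name below except the stage names is mine):

```python
# Schematic sketch of the pick pipeline: each grasp becomes a plan that is
# pushed through the stages in order; the first plan to pass every stage wins.

class Stage:
    def __init__(self, name, evaluate):
        self.name = name
        self.evaluate = evaluate  # plan -> bool (True = stage passed)

def run_pipeline(stages, grasps):
    """Return the first plan that passes every stage, or None."""
    for grasp in grasps:
        plan = {"grasp": grasp, "last_stage": None}
        passed = True
        for stage in stages:
            plan["last_stage"] = stage.name
            if not stage.evaluate(plan):
                passed = False
                break
        if passed:
            return plan
    return None

# Toy stand-ins for the real stage logic (IK check, approach/retreat, planning):
stages = [
    Stage("ReachableAndValidPoseFilter", lambda p: p["grasp"] != "no_ik"),
    Stage("ApproachAndTranslateStage", lambda p: p["grasp"] != "blocked"),
    Stage("PlanStage", lambda p: True),
]

winner = run_pipeline(stages, ["no_ik", "blocked", "good"])
# winner["grasp"] == "good"; it made it through "PlanStage"
```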
Q: in python, tf.frameExists() returns False even when tf_echo works?
A: I have no idea why, but tf_echo/tf_monitor can handle a leading '/' on frame names, while tf.TransformListener apparently cannot. For now, this was fixed by modifying openni.launch to not use a leading '/' on the published frame names.
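As a workaround in your own nodes, you can strip the leading slash before handing frame names to the Python tf API. A trivial helper, assuming frame ids arrive as plain strings:

```python
def normalize_frame(frame_id):
    # tf_echo tolerates a leading '/', but the Python tf.TransformListener
    # (e.g. frameExists) apparently does not; strip it defensively.
    return frame_id.lstrip('/')
```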
Q: I'm trying to pick up an object from the table, but plans always fail. I'm only adding the object to the collision map, and it often fails with: [ INFO] [1384385436.850180018]: Found a contact between '' (type 'Object') and 'box1' (type 'Robot attached'), which constitutes a collision. Contact information is not stored.
A: This seems to have been helped by writing my own bounding box computation that's a strict bounding box, rather than using the one from the arm_navigation pipeline. It also helps to wait a while, or to run it again ... it appears that the octomap doesn't update very quickly. (There's a known bug where the octomap doesn't properly update from added objects - the hack is to clear the octomap just before adding the objects.)
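A strict axis-aligned bounding box is simple to compute yourself. A minimal sketch (my own code, not the arm_navigation version), assuming the cluster is a list of (x, y, z) tuples in the object frame:

```python
def strict_bounding_box(points):
    """Tightest axis-aligned box containing the points: (center, extents)."""
    xs, ys, zs = zip(*points)
    center = tuple((min(axis) + max(axis)) / 2.0 for axis in (xs, ys, zs))
    extents = tuple(max(axis) - min(axis) for axis in (xs, ys, zs))
    return center, extents
```

The center/extents pair maps directly onto a box-shaped collision object's pose and dimensions.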
Q: I'm consistently getting errors saying that the same object is in the world and attached to the robot:
[ INFO] [1384829044.710711604]: Trajectory component 'retreat' is invalid
[ INFO] [1384829044.711454923]: Found a contact between 'box1' (type 'Object') and 'box1' (type 'Robot attached'), which constitutes a collision. Contact information is not stored.
[ INFO] [1384829044.711480876]: Collision checking is considered complete (collision was found and 0 contacts are stored)
[ INFO] [1384829044.711516411]: Stopping execution because the path to execute became invalid (probably the environment changed)
[ INFO] [1384829044.711565384]: Cancelling execution for l_arm_controller
[ INFO] [1384829044.711640468]: Stopped trajectory execution.
A: Did you forget to set is_diff in the PickupGoal?
- `pickup_goal.planning_options.planning_scene_diff.is_diff = True`
- `pickup_goal.planning_options.planning_scene_diff.robot_state.is_diff = True`
Q: Why did all the kinect points turn black?
A: No idea - this seems to happen after some amount of time running the openni driver. Just restart it.
Q: What controls the buffer in the octomap around collision_objects that are added to the PlanningScene?
A: In moveit_pr2/pr2_moveit_config/config/sensors_kinect.yaml, padding_offset: 0.1
This affects BOTH the self-filter and the objects added into the environment. Setting it to 0.03 seems to work OK - I suspect that it needs to be at least 1/2 the size of the octomap's cells!
Q: what's the difference between using moveit_commander vs. the actionlib interface?
A: So far as I can tell, just the richness of the commands you can send. (moveit_commander does eventually send actionlib messages, but it takes FOREVER to start up, and outputs debug info that looks like it's starting a whole separate move_group node.)
moveit_commander is a great source of example messages for controlling move_group (don't even think about trying to fill one in w/o a working example!)
Q: Why aren't the voxels clearing when I add an object to the collision world?
A: Bug in octomap. The workaround is to send a message that entirely clears the world, then add your objects. Subsequent kinect depth images will properly update only octomap cells not part of the objects.
Q: Why did rviz slow down so much when using the moveit MotionPlanning plugin?
A: If ORK is also running, it's sending object meshes, and that rendering is really slow (from email thread w/ Sachin & Ioan). It's slow even if they're not being displayed! Workaround is to add the line:
`<remap from="recognized_object_array" to="WTF"/>`
to pr2_moveit_config/launch/moveit_rviz.launch
Q: How do I get the kinect's data integrated into the octomap?
A: Those parameters are specified in pr2_moveit_config/config/sensors_kinect.yaml, and there are two different plugins for this.
Option 1:
- sensor_plugin: occupancy_map_monitor/PointCloudOctomapUpdater
- point_cloud_topic: /head_mount_kinect/depth_registered/points

Option 2:
- sensor_plugin: occupancy_map_monitor/DepthImageOctomapUpdater
- image_topic: /head_mount_kinect/depth_registered/image_raw
Option 2 seems to update faster, but leaves more voxels inside the collision objects. If you don't want kinect updates, just remove those params. (Cleaner would be to just not include the file in the launch.)
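As a concrete sketch, a minimal sensors_kinect.yaml for Option 1 might look like the following. Only sensor_plugin, point_cloud_topic, and padding_offset appear in these notes; verify any other fields against your MoveIt version before relying on them:

```yaml
# Sketch of pr2_moveit_config/config/sensors_kinect.yaml (Option 1).
sensors:
  - sensor_plugin: occupancy_map_monitor/PointCloudOctomapUpdater
    point_cloud_topic: /head_mount_kinect/depth_registered/points
    padding_offset: 0.03   # see the padding question above; default here was 0.1
```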
Q: What causes the behavior where it grasps, lifts, then goes down to grasp again and just stops there??
[ INFO] [1386875507.060319583]: Stopping execution because the path to execute became invalid (probably the environment changed)
[ INFO] [1386875507.060384917]: Cancelling execution for l_arm_controller
[ INFO] [1386875507.060477977]: Stopped trajectory execution.
[ INFO] [1386875507.228189320]: Waiting for a 3.000000 seconds before attempting a new plan ...
[ INFO] [1386875510.228393713]: Done waiting
[ INFO] [1386875510.228470198]: Planning attempt 2 of at most 5
<..snip..>
However, it eventually fails, b/c
<..snip..>
[ERROR] [1386875510.363984441]: Attempting to attach object 'obj1' to link 'l_wrist_roll_link' but no geometry specified and such an object does not exist in the collision world
[ERROR] [1386875516.262494411]: Execution of path-completion side-effect failed. Preempting.
[ INFO] [1386875516.278695855]: Stopping execution due to preempt request
[ INFO] [1386875516.278755237]: Cancelling execution for l_arm_controller
[ INFO] [1386875516.278817774]: Stopped trajectory execution.
A: I'm assuming that this is a bug in moveit - I'm still trying to track it down.