# Hand Eye Calibration
- This repository contains a set of MoveIt packages, including the hand_eye_calibration plugin.
- Therefore, after a successful catkin_make and sourcing the workspace, the panel is available in RViz.
First, launch Baxter as we did before. Then add the hand_eye_calibration panel in RViz and set the corresponding topics.
Install the solver dependencies, otherwise RViz keeps complaining that the solver failed to return a result. Install them from apt:
sudo apt install ros-noetic-handeye ros-noetic-baldor
or use the ones included in this repo.
- Eye-to-hand if you would like a fixed camera in the environment (i.e. a static spatial relationship between the camera frame and the world frame).
- Eye-in-hand if the camera is rigidly mounted on the robot's end-effector (i.e. a fixed spatial relationship between the camera frame and the gripper frame). The transform chains for both cases are sketched below.
A ChArUco board can also be used as the calibration target in either configuration: https://docs.opencv.org/master/da/d13/tutorial_aruco_calibration.html
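To make the two configurations above concrete, here is a minimal sketch of the transform chains (the pose values and frame names are hypothetical; it uses the spatialmath package that also appears further down this page):

```python
from spatialmath import SE3

# Hypothetical numbers, only to illustrate the two transform chains.
T_base_gripper      = SE3(0.6, 0.1, 0.4) * SE3.RPY(0.0, 1.57, 0.0)     # current end-effector pose
T_gripper_camera    = SE3(0.005, -0.016, 0.007)                        # eye-in-hand calibration result
T_base_camera_fixed = SE3(1.1, -0.3, 0.4) * SE3.RPY(0.0, 0.0, 3.14)    # eye-to-hand calibration result

# Eye-in-hand: the camera moves with the gripper, so its pose in the base frame
# is the current gripper pose composed with the calibrated gripper->camera transform.
T_base_camera = T_base_gripper * T_gripper_camera

# Eye-to-hand: the camera is static; a target seen in the camera frame maps into
# the base frame through the fixed base->camera transform.
T_camera_target = SE3(0.0, 0.0, 0.5)
T_base_target = T_base_camera_fixed * T_camera_target

print(T_base_camera)
print(T_base_target)
```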
<launch>
<!-- The rpy in the comment uses the extrinsic XYZ convention, which is the same as is used in a URDF. See
http://wiki.ros.org/geometry2/RotationMethods and https://en.wikipedia.org/wiki/Euler_angles for more info. -->
<!-- xyz="0.00468524 -0.0155631 0.00681717" rpy="0.128149 1.61547 1.43858" -->
<!-- From the plugin. This failed, so I manually rewrote it below using static_transform_publisher from tf:
<node pkg="tf2_ros" type="static_transform_publisher" name="camera_link_broadcaster"
args="0.00468524 -0.0155631 0.00681717 0.50848 0.513413 0.489235 0.488369 right_gripper d435_color_optical_frame" />
-->
<node pkg="tf" type="static_transform_publisher" name="right_gripper_to_d435_color_optical" args="
0.00468524 -0.0155631 0.00681717 0.50848 0.513413 0.489235 0.488369 right_gripper d435_color_optical_frame 10"/>
</launch>
Notice that I manually rewrote this using static_transform_publisher from the tf package; the tf2_ros static_transform_publisher node did not work for me.
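Even though the tf2_ros launch node gave trouble here, the tf2_ros Python API is a separate route: the same static transform can be broadcast from a small rospy node. A sketch using the values from the launch file above (the node name is arbitrary):

```python
#!/usr/bin/env python
import rospy
import tf2_ros
from geometry_msgs.msg import TransformStamped

rospy.init_node('right_gripper_to_d435_static_tf')  # arbitrary node name

t = TransformStamped()
t.header.stamp = rospy.Time.now()
t.header.frame_id = 'right_gripper'
t.child_frame_id = 'd435_color_optical_frame'
# Translation and quaternion (x, y, z, w) from the calibration above
t.transform.translation.x = 0.00468524
t.transform.translation.y = -0.0155631
t.transform.translation.z = 0.00681717
t.transform.rotation.x = 0.50848
t.transform.rotation.y = 0.513413
t.transform.rotation.z = 0.489235
t.transform.rotation.w = 0.488369

broadcaster = tf2_ros.StaticTransformBroadcaster()
broadcaster.sendTransform(t)
rospy.spin()  # keep the node alive so the latched transform stays available
```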
Sometimes the camera extrinsics come from another source rather than the MoveIt hand-eye plugin. We can derive the translation vector and quaternion from the extrinsic matrix and write our own roslaunch file to publish the static transform.
import numpy as np
from spatialmath import SO3, UnitQuaternion

# Full 4x4 homogeneous camera extrinsic (rotation + translation) from the other source
full_matrix = np.array([
    [ 0.9971, -0.0362, -0.0666,  1.1101],
    [ 0.0171, -0.7482,  0.6632, -0.2717],
    [-0.0738, -0.6625, -0.7454,  0.4262],
    [ 0,       0,       0,       1     ]
])

translation = full_matrix[:3, 3]       # x y z
rotation_matrix = full_matrix[:3, :3]

# check=False: the rounded values above are not perfectly orthonormal
so3_obj = SO3(rotation_matrix, check=False)
uq = UnitQuaternion(so3_obj)

print(translation)
print(uq)  # w, x, y, z
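The tf static_transform_publisher expects its arguments in the order x y z qx qy qz qw, while spatialmath prints the quaternion as w, x, y, z, so the values need to be reordered before they go into the launch file. A small sketch (the frame names are placeholders):

```python
x, y, z = full_matrix[:3, 3]
qw = uq.s             # scalar part
qx, qy, qz = uq.v     # vector part
print(f'args="{x} {y} {z} {qx} {qy} {qz} {qw} parent_frame child_frame 10"')
```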