Calibration - WilsonGuo/FastLivo_Replication GitHub Wiki

Essential Preparations Before Running FAST-LIVO:

There are two crucial tasks to complete before running FAST-LIVO:

  1. Time Synchronization
  2. Sensor Parameter Calibration

Both tasks significantly influence the quality of FAST-LIVO's final output.

Time synchronization was covered thoroughly in the first tutorial, using a hardware synchronization method. This tutorial walks through the intrinsic and extrinsic calibration of the sensors (camera and LiDAR).

1. Camera Intrinsic Calibration

Camera Calibration:

Camera Calibration Wiki - This is a ROS package for calibrating monocular or stereo cameras with a chessboard target (not AprilTag). As the official documentation explains, the package is built on OpenCV, so the underlying principle is the same. It provides worked examples for both monocular and stereo calibration, which you can consult if interested.

  1. Install the calibration package:
sudo apt install ros-$ROS_DISTRO-camera-calibration
  2. Print the chessboard: You can download and print the file (centered, at original size): Chessboard Calibration Board

For the underlying theory, see: Calibration Principle

  3. Start the calibration:

a. Open the camera node:

roslaunch /home/wilson/livox_driver_ws/src/hikrobot_camera/launch/hikrobot_camera.launch

b. Open the calibration node:

rosrun camera_calibration cameracalibrator.py --size 8x6 --square 0.102 image:=/left_camera/image

c. Move according to the prompts: Calibration Movement

Keep moving the board until the "CALIBRATE" button turns green, then click it to calibrate. Note that the capture session should not run too long and the frame rate should not be too high; otherwise too many images accumulate and slow down the calibration. As a rule of thumb, keep the number of samples under 100.

d. Save the results: Click the "CALIBRATE" button and wait for a while. Once calibration is complete, it will look like this: Calibration Results

Calibration Save

Both the "SAVE" and "COMMIT" buttons will turn green, and the terminal will display the calibration parameters. Click the "SAVE" button and wait a few seconds to save the results. The calibration results are saved in /tmp/calibrationdata.tar.gz, which can be manually copied. The content includes parameter files and the images used for calibration: Calibration Data

2. Extrinsic Calibration

Extrinsic calibration between the camera and lidar can be done in two ways:

Option 1: Target-less Calibration

livox_camera_calib - This tool provides robust, high-accuracy extrinsic calibration between high-resolution LiDARs (such as Livox) and cameras in a target-less environment. It operates in both indoor and outdoor scenes and only requires edge information from the scene. It can achieve pixel-level accuracy similar to or exceeding target-based methods if the scene is suitable.

The final result is an extrinsic.txt file: Extrinsic File

This is a 4x4 homogeneous transformation matrix: Homogeneous Transformation Matrix

Option 2: Target-based Calibration

velo2cam_calibration

Usage Guide

Steps to follow:

a. Launch the camera:

roslaunch /home/wilson/livox_driver_ws/src/hikrobot_camera/launch/hikrobot_camera.launch

b. Verify camera detection:

roslaunch velo2cam_calibration mono_pattern.launch camera_name:=/left_camera image_topic:=/left_camera/image frame_name:=hikrobot_camera

c. Launch the lidar:

roslaunch /home/wilson/livox_driver_ws/src/livox_ros_driver2/launch_ROS1/mid360.launch

d. Verify lidar detection:

roslaunch velo2cam_calibration lidar_pattern.launch cloud_topic:=/livox/lidar

e. Adjust filter parameters:

rosrun rqt_reconfigure rqt_reconfigure

f. Launch the calibration node:

roslaunch /home/wilson/livox_driver_ws/src/velo2cam_calibration-master/launch/registration.launch

This method is more user-friendly and easier to operate than target-less calibration.

Calibration Results: A file named calibrated_tf.launch will be generated in the launch folder: Calibration Launch File

Calibration Transform

Calibration Visualization
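For reference, such a launch file typically wraps a static_transform_publisher node. The frame names and all numeric arguments below are placeholders to show the shape of the file, not actual calibration output:

```xml
<launch>
  <!-- args order: x y z yaw pitch roll parent_frame child_frame period_ms -->
  <!-- The numbers here are placeholders, NOT a real calibration result. -->
  <node pkg="tf" type="static_transform_publisher" name="velo2cam_tf"
        args="0.0 0.0 0.0 0.0 0.0 0.0 rotated_hikrobot_camera livox_frame 100" />
</launch>
```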

A note on the camera's original coordinate system: Original Camera Coordinate System

The original camera (optical) coordinate system has the Z-axis pointing forward, the X-axis to the left, and the Y-axis downward. The result file therefore creates a proxy frame, rotated_hikrobot_camera, to align the camera with the ROS coordinate convention. If a TF transform alone does not meet our needs (what we ultimately want is a 4x4 homogeneous matrix), add the following node to look up the TF chain and print that matrix:

#include <ros/ros.h>
#include <Eigen/Dense>
#include <tf/transform_broadcaster.h>
#include <tf/message_filter.h>
#include <tf/transform_listener.h>
#include <pcl_conversions/pcl_conversions.h>
#include <pcl_ros/transforms.h>

int main(int argc, char** argv){
  ros::init(argc, argv, "my_tf_listener");
  ros::NodeHandle node;
  
  tf::TransformListener listener;  
  ros::Rate rate(10.0);
  
  while (node.ok()){
    try {
      tf::StampedTransform transform_lidar;
      tf::StampedTransform transform_cam;
      // lookupTransform(target, source) returns the transform that maps points
      // expressed in the source frame into the target frame.
      // Rotated (ROS-aligned) camera frame -> lidar frame:
      listener.lookupTransform("/livox_frame", "/rotated_hikrobot_camera", ros::Time(0), transform_lidar);
      // Original optical camera frame -> rotated camera frame:
      listener.lookupTransform("/rotated_hikrobot_camera", "/hikrobot_camera", ros::Time(0), transform_cam);

      Eigen::Matrix4f lidar2cam;  // rotated camera frame -> lidar frame
      Eigen::Matrix4f cam2cam;    // optical camera frame -> rotated camera frame
      pcl_ros::transformAsMatrix(transform_lidar, lidar2cam);
      pcl_ros::transformAsMatrix(transform_cam, cam2cam);

      // Chain optical -> rotated -> lidar (cam2cam is applied first, so it
      // sits on the right), then invert to get the lidar -> camera extrinsic.
      std::cout << "lidar-to-camera extrinsic:" << std::endl;
      std::cout << (lidar2cam * cam2cam).inverse() << std::endl;

    } catch (tf::TransformException &ex) {
      ROS_ERROR("%s", ex.what());
      ros::Duration(1.0).sleep();
      continue;
    }

    rate.sleep();
  }
  return 0;
}

The resulting output is shown below: Calibration Matrix Result

With this, the extrinsic calibration is complete.
