# Use and Effect
This page summarizes the steps and modifications needed to adapt the FAST-LIVO package to work with the Livox Mid360 LiDAR sensor.
- **`preprocess.cpp` (`avia_handler` function)**
  - This function processes the point cloud data received from the Avia LiDAR sensor (the Mid360 uses the same `livox_ros_driver2::CustomMsg` message type, so its data goes through this handler as well).
  - Modifications include handling the point cloud data, applying the feature extraction logic, and organizing the points into buffers by scan line.

  ```cpp
  void Preprocess::avia_handler(const livox_ros_driver2::CustomMsg::ConstPtr &msg)
  {
    // Code to process and extract features from Livox Avia (similar to Livox Mid360)
  }
  ```
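
  For orientation, the sketch below shows the kind of per-point loop such a handler needs. It assumes the `livox_ros_driver2::CustomMsg` field layout, a `pcl::PointXYZINormal` point type, and FAST-LIO-style names (`N_SCANS`, `pl_buff`, `blind`) that may differ from the actual members in this repository.

  ```cpp
  // Minimal sketch (not the repository's actual code): copy Livox custom-message
  // points into per-scan-line buffers. PointType, N_SCANS, pl_buff and blind are
  // assumed, FAST-LIO-style names -- adapt them to the members in preprocess.cpp.
  #include <livox_ros_driver2/CustomMsg.h>
  #include <pcl/point_cloud.h>
  #include <pcl/point_types.h>
  #include <vector>

  typedef pcl::PointXYZINormal PointType;

  void handle_mid360(const livox_ros_driver2::CustomMsg::ConstPtr &msg,
                     std::vector<pcl::PointCloud<PointType>> &pl_buff,
                     int N_SCANS, double blind)
  {
    for (auto &buf : pl_buff) buf.clear();

    for (uint32_t i = 0; i < msg->point_num; ++i)
    {
      const auto &p = msg->points[i];
      if (p.line >= N_SCANS) continue;              // Mid360 configs typically use 4 scan lines
      double sq_range = p.x * p.x + p.y * p.y + p.z * p.z;
      if (sq_range < blind * blind) continue;       // drop returns too close to the sensor

      PointType pt;
      pt.x = p.x;
      pt.y = p.y;
      pt.z = p.z;
      pt.intensity = p.reflectivity;
      pt.curvature = p.offset_time / 1e6f;          // per-point time offset, ns -> ms
      pl_buff[p.line].push_back(pt);
    }
  }
  ```

  Keeping the per-point time offset (here in `curvature`) matters because the downstream motion compensation relies on it.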
- **`laserMapping.cpp` (`publish_frame_world_rgb` function)**
  - Adjustments to publish the RGB point cloud, accounting for the field-of-view differences between the Avia and the Mid360.
  - Converts world coordinates to camera pixel coordinates and retrieves the RGB values from the corresponding pixels of the camera image.

  ```cpp
  void publish_frame_world_rgb(const ros::Publisher &pubLaserCloudFullRes,
                               lidar_selection::LidarSelectorPtr lidar_selector)
  {
    // Code to publish RGB point cloud data, adjusting for FOV differences
  }
  ```
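
  The conversion itself is not shown in the stub above, so here is a minimal, hedged sketch of the world-to-pixel projection and RGB lookup it describes, using Eigen/OpenCV types and placeholder names (`Rcw`, `Pcw`, `fx`, `fy`, `cx`, `cy`, `img_bgr`) rather than FAST-LIVO's actual members.

  ```cpp
  // Hedged sketch of the world -> pixel -> RGB lookup described above; Rcw/Pcw,
  // fx/fy/cx/cy and img_bgr are placeholders for the calibration and current
  // camera frame, not FAST-LIVO's actual member names.
  #include <Eigen/Core>
  #include <opencv2/core.hpp>
  #include <cstdint>

  struct RGB { uint8_t r, g, b; bool valid; };

  RGB sample_rgb(const Eigen::Vector3d &pw,        // point in the world frame
                 const Eigen::Matrix3d &Rcw,       // world -> camera rotation
                 const Eigen::Vector3d &Pcw,       // world -> camera translation
                 double fx, double fy, double cx, double cy,
                 const cv::Mat &img_bgr)           // current image, CV_8UC3 (BGR)
  {
    RGB out{0, 0, 0, false};

    Eigen::Vector3d pc = Rcw * pw + Pcw;           // transform into the camera frame
    if (pc.z() <= 0.1) return out;                 // behind the camera or too close

    int u = static_cast<int>(fx * pc.x() / pc.z() + cx);  // pinhole projection
    int v = static_cast<int>(fy * pc.y() / pc.z() + cy);
    if (u < 0 || v < 0 || u >= img_bgr.cols || v >= img_bgr.rows)
      return out;                                  // outside the camera's field of view

    const cv::Vec3b &px = img_bgr.at<cv::Vec3b>(v, u);
    out.b = px[0]; out.g = px[1]; out.r = px[2];
    out.valid = true;
    return out;
  }
  ```

  Because the Mid360 covers a much wider horizontal field of view than the camera, more points fall outside the image than with the Avia, so the in-bounds check ends up doing more of the work here.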
- **Create `camera_pinhole_hk.yaml`**
  - Defines camera intrinsic parameters such as the focal length and principal point.
  - Example configuration:

  ```yaml
  image_width: 1280
  image_height: 720
  camera_name: camera
  camera_matrix:
    rows: 3
    cols: 3
    data: [800.0, 0.0, 640.0, 0.0, 800.0, 360.0, 0.0, 0.0, 1.0]
  distortion_model: plumb_bob
  distortion_coefficients:
    rows: 1
    cols: 5
    data: [-0.08, 0.11, 0.0, 0.0, 0.0]
  rectification_matrix:
    rows: 3
    cols: 3
    data: [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0]
  projection_matrix:
    rows: 3
    cols: 4
    data: [800.0, 0.0, 640.0, 0.0, 0.0, 800.0, 360.0, 0.0, 0.0, 0.0, 1.0, 0.0]
  ```
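
  As a quick way to confirm the intrinsics behave as expected, the snippet below projects a test point through this calibration with OpenCV's `cv::projectPoints`. The values are hard-coded from the example above; this is an illustration only, not part of FAST-LIVO.

  ```cpp
  // Standalone sanity check (illustration only): project a camera-frame point
  // through the example camera_matrix and plumb_bob coefficients above.
  #include <opencv2/calib3d.hpp>
  #include <opencv2/core.hpp>
  #include <iostream>
  #include <vector>

  int main()
  {
    cv::Mat K = (cv::Mat_<double>(3, 3) << 800.0,   0.0, 640.0,
                                             0.0, 800.0, 360.0,
                                             0.0,   0.0,   1.0);
    cv::Mat D = (cv::Mat_<double>(1, 5) << -0.08, 0.11, 0.0, 0.0, 0.0);

    cv::Mat rvec = cv::Mat::zeros(3, 1, CV_64F);   // identity rotation
    cv::Mat tvec = cv::Mat::zeros(3, 1, CV_64F);   // zero translation

    // A point 1 m ahead of the camera and 0.2 m to the right should land to the
    // right of the principal point, on the image's centre row.
    std::vector<cv::Point3d> pts{{0.2, 0.0, 1.0}};
    std::vector<cv::Point2d> px;
    cv::projectPoints(pts, rvec, tvec, K, D, px);

    std::cout << "pixel: " << px[0] << std::endl;  // ~(799.5, 360); (800, 360) without distortion
    return 0;
  }
  ```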
- **Create `mid360.yaml`**
  - Defines the camera extrinsic parameters, i.e. the transformation from the camera frame to the LiDAR frame.
  - Example configuration:

  ```yaml
  cam_to_lidar:
    translation: [0.1, 0.0, 0.2]
    rotation: [0.0, 0.0, 0.0]
  ```

  Adjust the `translation` and `rotation` values based on your specific setup.
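
  How these values are consumed depends on the exact keys your build of FAST-LIVO reads, so the snippet below is only an assumption-laden sketch: it treats `rotation` as roll/pitch/yaw in radians and builds the corresponding 4x4 camera-to-LiDAR transform with Eigen.

  ```cpp
  // Sketch: turn the translation plus a roll/pitch/yaw rotation into a 4x4
  // camera->LiDAR transform with Eigen. Interpreting `rotation` as RPY in
  // radians is an assumption -- match it to however your config is parsed.
  #include <Eigen/Core>
  #include <Eigen/Geometry>

  Eigen::Matrix4d cam_to_lidar_transform(const Eigen::Vector3d &translation,
                                         const Eigen::Vector3d &rpy)
  {
    Eigen::Matrix3d R =
        (Eigen::AngleAxisd(rpy.z(), Eigen::Vector3d::UnitZ()) *   // yaw
         Eigen::AngleAxisd(rpy.y(), Eigen::Vector3d::UnitY()) *   // pitch
         Eigen::AngleAxisd(rpy.x(), Eigen::Vector3d::UnitX()))    // roll
            .toRotationMatrix();

    Eigen::Matrix4d T = Eigen::Matrix4d::Identity();
    T.block<3, 3>(0, 0) = R;
    T.block<3, 1>(0, 3) = translation;   // e.g. [0.1, 0.0, 0.2] from the example above
    return T;
  }
  ```

  If your configuration stores a full 3x3 rotation matrix instead, skip the RPY conversion and copy it into the upper-left block directly.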
- **Create `mapping_mid360.launch`**
  - Launch file that starts FAST-LIVO with the parameters loaded from `mid360.yaml` and `camera_pinhole_hk.yaml`.
  - Example launch configuration:

  ```xml
  <launch>
    <rosparam command="load" file="$(find fast_livo)/config/mid360.yaml" />

    <node pkg="fast_livo" type="fastlivo_mapping" name="laserMapping" output="screen">
      <rosparam file="$(find fast_livo)/config/camera_pinhole_hk.yaml" />
    </node>

    <node pkg="rviz" type="rviz" name="rviz" args="-d $(find fast_livo)/rviz_cfg/loam_livox.rviz" />

    <node pkg="image_transport" type="republish" name="republish"
          args="compressed in:=/left_camera/image raw out:=/left_camera/image"
          output="screen" respawn="true" />
  </launch>
  ```
- After making these modifications, rebuild your catkin workspace using `catkin_make` and source the setup script (`source devel/setup.bash`).
- Launch the FAST-LIVO node using `roslaunch fast_livo_ws/src/FAST-LIVO/launch/mapping_mid360.launch`.
These steps should let you integrate and configure FAST-LIVO to work with the Livox Mid360 LiDAR sensor, with correct data processing and visualization in ROS. Adjust the parameters and configurations to match your specific hardware setup and requirements.