LiDAR Setup - ArcturusNavigation/all_seaing_vehicle GitHub Wiki
## Connecting the LiDAR
When using the Velodyne LiDAR, power the interface box with 12V (confirm it is 12V by checking the back of the power brick!) and connect the interface box directly to the Jetson with an Ethernet cable. Once power is provided, you should notice the LiDAR spinning, visibly (HDL-32E) or audibly (VLP-16 or SICK multiScan).
## Network setup
Note: when running this step, make sure that the Jetson and your computer are connected to the same network.
By default, Velodyne LiDARs have the IP address `192.168.1.201` and broadcast data to `255.255.255.255` (the local network broadcast address). If you type the LiDAR's IP address into your browser's address bar, you can open its configuration settings page.
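To reach the configuration page, your computer's Ethernet interface must be on the same subnet as the LiDAR. A quick sanity check is sketched below; the `192.168.1.0/24` subnet matches the default LiDAR address above, and the host address is a placeholder you should replace with the one reported by `ip addr`:

```python
import ipaddress

# Default Velodyne LiDAR address (see above)
lidar_ip = ipaddress.ip_address("192.168.1.201")
# Placeholder: replace with your interface's address from `ip addr`
host_ip = ipaddress.ip_address("192.168.1.77")

subnet = ipaddress.ip_network("192.168.1.0/24")
# Both addresses must fall in the same /24 for the browser to reach the LiDAR
print(lidar_ip in subnet and host_ip in subnet)  # True -> same subnet
```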
After connecting the LiDAR to the Jetson, try pinging or ssh-ing into the Jetson from your computer. In my experience, when you connect the LiDAR to the Jetson, `eth0` automatically takes priority over `wlan0`, preventing other computers from ssh-ing into the Jetson via WiFi. To allow the Jetson to find your computer via WiFi, first find your computer's IP address using `ip addr`. Depending on your WiFi card, the WiFi network interface name will be something like `wlan0` or `wlp0s20f3`, and the corresponding IP address will look something like `192.168.1.X`.
Check whether your computer's IP address is already in the Jetson's routing table by running `route -n`. If your IP address appears as one of the destination addresses, you should be all set. Otherwise, continue with this tutorial.
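The check above amounts to scanning the destination column of the `route -n` output for your address. A minimal sketch, using hard-coded sample output (in practice you would capture the command's output, e.g. via `subprocess`):

```python
# Sample `route -n` output; replace with the real output from the Jetson
route_output = """\
Kernel IP routing table
Destination     Gateway         Genmask         Flags Metric Ref    Use Iface
0.0.0.0         192.168.1.1     0.0.0.0         UG    600    0        0 wlan0
192.168.1.77    0.0.0.0         255.255.255.255 UH    600    0        0 wlan0
"""

def in_routing_table(output: str, ip: str) -> bool:
    # Skip the two header lines, then compare the first column to the IP
    rows = output.strip().splitlines()[2:]
    return any(row.split()[0] == ip for row in rows)

print(in_routing_table(route_output, "192.168.1.77"))  # True
print(in_routing_table(route_output, "192.168.1.99"))  # False
```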
Now, allow the Jetson to route data to your computer by running the following command:

```
nmcli con modify <profile_name> +ipv4.routes "<your_ip_address>/32 0.0.0.0"
```
where `<profile_name>` is the name of the network profile (typically the SSID, `arcturus`) and `<your_ip_address>` is the computer IP address you determined earlier. Internally, this adds your computer's IP address to the routing table, which is used to determine where data packets are sent. After you disconnect and reconnect WiFi on the Jetson, you should see a new entry when running `route -n`, and your computer and the Jetson should be able to communicate with each other. Again, try `ping <jetson_ip_address>` to confirm the connection.
Finally, use the following command if you want to remove entries from the routing table:

```
nmcli con modify <profile_name> -ipv4.routes "<your_ip_address>/32 0.0.0.0"
```
## Camera-LiDAR Calibration
To accurately overlay the point cloud on the camera image, we need to know the extrinsic matrix, which expresses the relative pose between the camera and the LiDAR. While there are many packages that perform this calibration, most are deprecated and/or difficult to use. A recent and previously tested package is Direct Visual LiDAR Calibration, which also has the nice property of being target-less, meaning we do not require a calibration board. Its installation and usage are outlined clearly in the package's documentation. Here are some notes and tips for an accurate calibration:
- You can install GTSAM from the pre-built binaries rather than building from source by running `sudo apt install ros-humble-gtsam`
- For 16-beam LiDARs (e.g. VLP-16), collect 5-10 bagfiles, each approximately 30-40 seconds long
- For LiDARs with more beams (e.g. HDL-32E), you can collect fewer, shorter bagfiles (10-20 seconds each)
- Each bagfile should capture a different environment or perspective
- The captured environment should have clear features that make it easy to identify point-cloud-to-image correspondences (doors, windows, etc.)
- If using SuperGlue, check the point-cloud-to-image correspondence images to make sure they make sense. If they do not, manual initial estimates may give better performance, especially for 16-beam LiDARs.
Note: The ZED wrapper publishes the `camera_info` distortion model as `rational_polynomial` and not `plumb_bob`, which causes the package to error out. To fix this, replace the if-statement inside the `create_camera` function in `create_camera.cpp` with `if (camera_model == "plumb_bob" || camera_model == "rational_polynomial")`.
Finally, to update the transforms, replace the values in `all_seaing_description/launch/static_transforms.launch.py` with the new values in `calib.json`.
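If the launch file expects Euler angles while the calibration result is stored as a translation plus quaternion, a small conversion script can help. This is a sketch: the `results`/`T_lidar_camera` key names and the `[x, y, z, qx, qy, qz, qw]` ordering below are assumptions, so check them against your actual `calib.json` before copying values.

```python
import math

def quat_to_rpy(qx, qy, qz, qw):
    """Convert a unit quaternion to roll, pitch, yaw in radians (ZYX convention)."""
    roll = math.atan2(2 * (qw * qx + qy * qz), 1 - 2 * (qx * qx + qy * qy))
    pitch = math.asin(max(-1.0, min(1.0, 2 * (qw * qy - qz * qx))))
    yaw = math.atan2(2 * (qw * qz + qx * qy), 1 - 2 * (qy * qy + qz * qz))
    return roll, pitch, yaw

# Hypothetical calib.json layout -- verify the key names and element order
calib = {"results": {"T_lidar_camera": [0.1, 0.0, -0.05,
                                        0.0, 0.0, 0.7071068, 0.7071068]}}
x, y, z, qx, qy, qz, qw = calib["results"]["T_lidar_camera"]
roll, pitch, yaw = quat_to_rpy(qx, qy, qz, qw)
print(f"translation: {x} {y} {z}")
print(f"roll/pitch/yaw: {roll:.4f} {pitch:.4f} {yaw:.4f}")  # yaw ~ 1.5708 here
```

The sample quaternion corresponds to a pure 90° yaw rotation, which is a handy sanity check before trusting the converted values.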
## Running the LiDAR
If using the HDL-32E, use the following command:

```
ros2 launch all_seaing_driver 32e_points.launch.py
```
Open up RViz with the fixed frame set to `velodyne`:

```
ros2 run rviz2 rviz2 -f velodyne
```
Finally, add a PointCloud2 display with the topic name `velodyne_points` to visualize the LiDAR point cloud, and a LaserScan display with the topic `scan` to visualize the laser scan!