Demos - nikolaradulov/SLAMFuse GitHub Wiki

Quick Start

Quick start from the very beginning

Running and loading algorithms and datasets within containers (Detailed)

NOTE: This article might be subject to further revisions

We assume that all required dependencies are installed and that the main container is present on the host machine. For this, refer back to the Installation Guide.

The starter.py wrapper

Using SLAMBench with a GUI from within the container depends on the operating system of the host. To keep things simple for the user, the starter.py program is provided. It is essentially a wrapper that sets the required variables and command line for using the application. It is used to create volumes for the datasets, install the SLAMBench distribution and run benchmarks. A minimal end-to-end sketch is shown after the help output below.

nrad@LAPTOP-0R48DLRK:~/slambench$ python3 starter.py  -h
usage: starter.py [-h] {run,build,dataset} ...

This is a tool to run Docker containers with SLAMBench.

positional arguments:
  {run,build,dataset}  Select the mode of operation.
    run                When running the tool in run mode
    build              Running the tool in build mode
    dataset            Running the tool in dataset mode

optional arguments:
  -h, --help           show this help message and exit
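As a quick orientation, the typical workflow is: create a volume for the dataset you want, then run a benchmark against it with an algorithm volume. Below is a minimal sketch of that sequence, reusing the FDRAGON volume, dataset and lsdslam library that appear in the examples further down this page (treat the exact names as placeholders for your own setup):

# Build a dataset volume named FDRAGON (dataset names come from the list below)
python3 starter.py dataset -t make -d ./datasets/UZHFPV/indoor_forward_3_snapdragon_with_gt.slam -v FDRAGON

# Run a benchmark against that volume, with command-line output only
python3 starter.py run -t cli -dv FDRAGON -d indoor_forward_3_snapdragon_with_gt.slam -a lsdslam/liblsdslam-cpp-library.so

Each of these commands is covered in detail in the sections below.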

Creating dataset volumes

SLAMBench cannot use datasets out of the box, but tools are provided to make life easier and prepare the datasets you need in the required format.

Existing datasets

We introduce out of the box support for the following datasets:

  • OpenLORIS [Shi et al, ICRA'20]: Lifelong SLAM dataset
  • Bonn Dynamic [Palazzolo et al, IROS'19]: Dynamic scene dataset
  • UZH-FPV [Delmerico et al. ICRA'19]: Drone racing dataset
  • ETH Illumination [Park et al, ICRA'17]: Illumination changes dataset
  • VolumeDeform [Innmann et al, ECCV'16]: Non-rigid reconstruction
  • EuRoC MAV [Burri et al, IJRR'16]: Micro Aerial Vehicle dataset
  • ICL-NUIM [Handa et al, ICRA'14]: Synthetic dataset
  • TUM RGB-D [Sturm et al, IROS'12]: A standard SLAM benchmark

You can see the complete list of possible trajectories to build with python3 starter.py dataset -t list:

nrad@LAPTOP-0R48DLRK:~/slambench$ python3 starter.py dataset -t list
Command: docker run slambench/main --list_datasets
The following datasets are built on your system and available for use:

Here is a list of the datasets available.
If you are using any of the following datasets, please refer to their respective publications:
        - TUM RGB-D SLAM dataset [Sturm et al, IROS'12]: https://vision.in.tum.de/data/datasets/rgbd-dataset
        - ICL-NUIM dataset [Handa et al, ICRA'14]: https://www.doc.ic.ac.uk/~ahanda/VaFRIC/iclnuim.html
        - EuRoC MAV Dataset [Burri et al, IJRR'16]: https://projects.asl.ethz.ch/datasets/doku.php
        - SVO sample dataset [Forster et al, ICRA 2014]: https://github.com/uzh-rpg/rpg_svo
        - Bonn RGB-D Dynamic Dataset [Palazzolo et al, IROS'19]: http://www.ipb.uni-bonn.de/data/rgbd-dynamic-dataset/
        - UZH-FPV Drone Racing Dataset [Delmerico et al, ICRA'19]: http://rpg.ifi.uzh.ch/uzh-fpv.html
        - OpenLORIS-Scene datasets [Shi et al, ICRA'20]: https://lifelong-robotic-vision.github.io/dataset/scene
=================================================================================================================

=================================================================================================================
SLAMBench integrates tools to automatically generate files compatible with SLAMBench from existing datasets.
SLAMBench cannot download the OpenLORIS data for you. Please download the data manually (*-package.tar) to ./datasets/OpenLORIS/
For details, please visit: https://lifelong-robotic-vision.github.io/dataset/scene

        ### TUM Testing and Debugging ###
         make ./datasets/TUM/freiburg1/rgbd_dataset_freiburg1_xyz.slam
         make ./datasets/TUM/freiburg1/rgbd_dataset_freiburg1_rpy.slam
         make ./datasets/TUM/freiburg2/rgbd_dataset_freiburg2_xyz.slam
         make ./datasets/TUM/freiburg2/rgbd_dataset_freiburg2_rpy.slam

        ### TUM Handheld SLAM ###
         make ./datasets/TUM/freiburg1/rgbd_dataset_freiburg1_360.slam
         make ./datasets/TUM/freiburg1/rgbd_dataset_freiburg1_floor.slam
         make ./datasets/TUM/freiburg1/rgbd_dataset_freiburg1_desk.slam
         make ./datasets/TUM/freiburg1/rgbd_dataset_freiburg1_desk2.slam
         make ./datasets/TUM/freiburg1/rgbd_dataset_freiburg1_room.slam
         make ./datasets/TUM/freiburg2/rgbd_dataset_freiburg2_360_hemisphere.slam
         make ./datasets/TUM/freiburg2/rgbd_dataset_freiburg2_360_kidnap.slam
         make ./datasets/TUM/freiburg2/rgbd_dataset_freiburg2_desk.slam
         make ./datasets/TUM/freiburg2/rgbd_dataset_freiburg2_desk_with_person.slam
         make ./datasets/TUM/freiburg2/rgbd_dataset_freiburg2_large_no_loop.slam
         make ./datasets/TUM/freiburg2/rgbd_dataset_freiburg2_large_with_loop.slam

        ### TUM Robot SLAM ###
         make ./datasets/TUM/freiburg2/rgbd_dataset_freiburg2_pioneer_360.slam
         make ./datasets/TUM/freiburg2/rgbd_dataset_freiburg2_pioneer_slam.slam
         make ./datasets/TUM/freiburg2/rgbd_dataset_freiburg2_pioneer_slam2.slam
         make ./datasets/TUM/freiburg2/rgbd_dataset_freiburg2_pioneer_slam3.slam

        ### TUM Structure vs Texture ###
         make ./datasets/TUM/freiburg3/rgbd_dataset_freiburg3_nostructure_notexture_far.slam
         make ./datasets/TUM/freiburg3/rgbd_dataset_freiburg3_nostructure_notexture_near_withloop.slam
         make ./datasets/TUM/freiburg3/rgbd_dataset_freiburg3_nostructure_texture_far.slam
         make ./datasets/TUM/freiburg3/rgbd_dataset_freiburg3_nostructure_texture_near_withloop.slam
         make ./datasets/TUM/freiburg3/rgbd_dataset_freiburg3_structure_notexture_far.slam
         make ./datasets/TUM/freiburg3/rgbd_dataset_freiburg3_structure_notexture_near.slam
         make ./datasets/TUM/freiburg3/rgbd_dataset_freiburg3_structure_texture_far.slam
         make ./datasets/TUM/freiburg3/rgbd_dataset_freiburg3_structure_texture_near.slam

        ### TUM Dynamic Objects ###
         make ./datasets/TUM/freiburg2/rgbd_dataset_freiburg2_desk_with_person.slam
         make ./datasets/TUM/freiburg3/rgbd_dataset_freiburg3_sitting_static.slam
         make ./datasets/TUM/freiburg3/rgbd_dataset_freiburg3_sitting_xyz.slam
         make ./datasets/TUM/freiburg3/rgbd_dataset_freiburg3_sitting_halfsphere.slam
         make ./datasets/TUM/freiburg3/rgbd_dataset_freiburg3_sitting_rpy.slam
         make ./datasets/TUM/freiburg3/rgbd_dataset_freiburg3_walking_static.slam
         make ./datasets/TUM/freiburg3/rgbd_dataset_freiburg3_walking_xyz.slam
         make ./datasets/TUM/freiburg3/rgbd_dataset_freiburg3_walking_halfsphere.slam
         make ./datasets/TUM/freiburg3/rgbd_dataset_freiburg3_walking_rpy.slam

        ### TUM 3D Object Reconstruction ###
         make ./datasets/TUM/freiburg1/rgbd_dataset_freiburg1_plant.slam
         make ./datasets/TUM/freiburg1/rgbd_dataset_freiburg1_teddy.slam
         make ./datasets/TUM/freiburg2/rgbd_dataset_freiburg2_coke.slam
         make ./datasets/TUM/freiburg2/rgbd_dataset_freiburg2_dishes.slam
         make ./datasets/TUM/freiburg2/rgbd_dataset_freiburg2_flowerbouquet.slam
         make ./datasets/TUM/freiburg2/rgbd_dataset_freiburg2_flowerbouquet_brownbackground.slam
         make ./datasets/TUM/freiburg2/rgbd_dataset_freiburg2_metallic_sphere.slam
         make ./datasets/TUM/freiburg2/rgbd_dataset_freiburg2_metallic_sphere2.slam
         make ./datasets/TUM/freiburg3/rgbd_dataset_freiburg3_cabinet.slam
         make ./datasets/TUM/freiburg3/rgbd_dataset_freiburg3_large_cabinet.slam
         make ./datasets/TUM/freiburg3/rgbd_dataset_freiburg3_teddy.slam

        ### ICL_NUIM Living Room ###
         make ./datasets/ICL_NUIM/living_room_traj0_loop.slam
         make ./datasets/ICL_NUIM/living_room_traj1_loop.slam
         make ./datasets/ICL_NUIM/living_room_traj2_loop.slam
         make ./datasets/ICL_NUIM/living_room_traj3_loop.slam

        ### ETHI Illumination Dataset ###
         make ./datasets/ETHI/ethl_real_flash.slam
         make ./datasets/ETHI/ethl_real_local.slam
         make ./datasets/ETHI/ethl_real_global.slam
         make ./datasets/ETHI/ethl_syn1.slam
         make ./datasets/ETHI/ethl_syn1_local.slam
         make ./datasets/ETHI/ethl_syn1_global.slam
         make ./datasets/ETHI/ethl_syn1_loc_glo.slam
         make ./datasets/ETHI/ethl_syn1_flash.slam
         make ./datasets/ETHI/ethl_syn2.slam
         make ./datasets/ETHI/ethl_syn2_local.slam
         make ./datasets/ETHI/ethl_syn2_global.slam
         make ./datasets/ETHI/ethl_syn2_loc_glo.slam
         make ./datasets/ETHI/ethl_syn2_flash.slam

        ### EuRoCMAV Machine Hall ###
         make ./datasets/EuRoCMAV/machine_hall/MH_01_easy/MH_01_easy.slam
         make ./datasets/EuRoCMAV/machine_hall/MH_02_easy/MH_02_easy.slam
         make ./datasets/EuRoCMAV/machine_hall/MH_03_medium/MH_03_medium.slam
         make ./datasets/EuRoCMAV/machine_hall/MH_04_difficult/MH_04_difficult.slam
         make ./datasets/EuRoCMAV/machine_hall/MH_05_difficult/MH_05_difficult.slam

        ### EuRoCMAV Vicon Room ###
         make ./datasets/EuRoCMAV/vicon_room1/V1_01_easy/V1_01_easy.slam
         make ./datasets/EuRoCMAV/vicon_room1/V1_02_medium/V1_02_medium.slam
         make ./datasets/EuRoCMAV/vicon_room1/V1_03_difficult/V1_03_difficult.slam
         make ./datasets/EuRoCMAV/vicon_room2/V2_01_easy/V2_01_easy.slam
         make ./datasets/EuRoCMAV/vicon_room2/V2_02_medium/V2_02_medium.slam
         make ./datasets/EuRoCMAV/vicon_room2/V2_03_difficult/V2_03_difficult.slam

        ### BONN Balloon ###
         make ./datasets/BONN/rgbd_bonn_balloon.slam
         make ./datasets/BONN/rgbd_bonn_balloon2.slam
         make ./datasets/BONN/rgbd_bonn_balloon_tracking.slam
         make ./datasets/BONN/rgbd_bonn_balloon_tracking2.slam

        ### BONN People ###
         make ./datasets/BONN/rgbd_bonn_crowd.slam
         make ./datasets/BONN/rgbd_bonn_crowd2.slam
         make ./datasets/BONN/rgbd_bonn_crowd3.slam
         make ./datasets/BONN/rgbd_bonn_person_tracking.slam
         make ./datasets/BONN/rgbd_bonn_person_tracking2.slam

        ### BONN Boxes ###
         make ./datasets/BONN/rgbd_bonn_kidnapping_box.slam
         make ./datasets/BONN/rgbd_bonn_kidnapping_box2.slam
         make ./datasets/BONN/rgbd_bonn_moving_nonobstructing_box.slam
         make ./datasets/BONN/rgbd_bonn_moving_nonobstructing_box2.slam
         make ./datasets/BONN/rgbd_bonn_moving_obstructing_box.slam
         make ./datasets/BONN/rgbd_bonn_moving_obstructing_box2.slam
         make ./datasets/BONN/rgbd_bonn_placing_nonobstructing_box.slam
         make ./datasets/BONN/rgbd_bonn_placing_nonobstructing_box2.slam
         make ./datasets/BONN/rgbd_bonn_placing_nonobstructing_box3.slam
         make ./datasets/BONN/rgbd_bonn_placing_obstructing_box.slam
         make ./datasets/BONN/rgbd_bonn_removing_nonobstructing_box.slam
         make ./datasets/BONN/rgbd_bonn_removing_nonobstructing_box2.slam
         make ./datasets/BONN/rgbd_bonn_removing_obstructing_box.slam

        ### BONN Synchronous and Static ###
         make ./datasets/BONN/rgbd_bonn_synchronous.slam
         make ./datasets/BONN/rgbd_bonn_synchronous2.slam
         make ./datasets/BONN/rgbd_bonn_static.slam
         make ./datasets/BONN/rgbd_bonn_static_close_far.slam

        ### UZHFPV Indoor forward facing Snapdragon ###
         make ./datasets/UZHFPV/indoor_forward_3_snapdragon_with_gt.slam
         make ./datasets/UZHFPV/indoor_forward_5_snapdragon_with_gt.slam
         make ./datasets/UZHFPV/indoor_forward_6_snapdragon_with_gt.slam
         make ./datasets/UZHFPV/indoor_forward_7_snapdragon_with_gt.slam
         make ./datasets/UZHFPV/indoor_forward_8_snapdragon.slam
         make ./datasets/UZHFPV/indoor_forward_9_snapdragon_with_gt.slam
         make ./datasets/UZHFPV/indoor_forward_10_snapdragon_with_gt.slam
         make ./datasets/UZHFPV/indoor_forward_11_snapdragon.slam
         make ./datasets/UZHFPV/indoor_forward_12_snapdragon.slam

        ### UZHFPV Indoor forward facing Davis ###
         make ./datasets/UZHFPV/indoor_forward_3_davis_with_gt.slam
         make ./datasets/UZHFPV/indoor_forward_5_davis_with_gt.slam
         make ./datasets/UZHFPV/indoor_forward_6_davis_with_gt.slam
         make ./datasets/UZHFPV/indoor_forward_7_davis_with_gt.slam
         make ./datasets/UZHFPV/indoor_forward_8_davis.slam
         make ./datasets/UZHFPV/indoor_forward_9_davis_with_gt.slam
         make ./datasets/UZHFPV/indoor_forward_10_davis_with_gt.slam
         make ./datasets/UZHFPV/indoor_forward_11_davis.slam
         make ./datasets/UZHFPV/indoor_forward_12_davis.slam

        ### UZHFPV Indoor 45 degree downward facing Snapdragon ###
         make ./datasets/UZHFPV/indoor_45_1_snapdragon.slam
         make ./datasets/UZHFPV/indoor_45_2_snapdragon_with_gt.slam
         make ./datasets/UZHFPV/indoor_45_3_snapdragon.slam
         make ./datasets/UZHFPV/indoor_45_4_snapdragon_with_gt.slam
         make ./datasets/UZHFPV/indoor_45_9_snapdragon_with_gt.slam
         make ./datasets/UZHFPV/indoor_45_11_snapdragon.slam
         make ./datasets/UZHFPV/indoor_45_12_snapdragon_with_gt.slam
         make ./datasets/UZHFPV/indoor_45_13_snapdragon_with_gt.slam
         make ./datasets/UZHFPV/indoor_45_14_snapdragon_with_gt.slam
         make ./datasets/UZHFPV/indoor_45_16_snapdragon.slam

        ### UZHFPV Indoor 45 degree downward facing Davis ###
         make ./datasets/UZHFPV/indoor_45_1_davis.slam
         make ./datasets/UZHFPV/indoor_45_2_davis_with_gt.slam
         make ./datasets/UZHFPV/indoor_45_3_davis.slam
         make ./datasets/UZHFPV/indoor_45_4_davis_with_gt.slam
         make ./datasets/UZHFPV/indoor_45_9_davis_with_gt.slam
         make ./datasets/UZHFPV/indoor_45_11_davis.slam
         make ./datasets/UZHFPV/indoor_45_12_davis_with_gt.slam
         make ./datasets/UZHFPV/indoor_45_13_davis_with_gt.slam
         make ./datasets/UZHFPV/indoor_45_14_davis_with_gt.slam
         make ./datasets/UZHFPV/indoor_45_16_davis.slam

        ### UZHFPV Outdoor forward facing Snapdragon ###
         make ./datasets/UZHFPV/outdoor_forward_1_snapdragon_with_gt.slam
         make ./datasets/UZHFPV/outdoor_forward_2_snapdragon.slam
         make ./datasets/UZHFPV/outdoor_forward_3_snapdragon_with_gt.slam
         make ./datasets/UZHFPV/outdoor_forward_5_snapdragon_with_gt.slam
         make ./datasets/UZHFPV/outdoor_forward_6_snapdragon.slam
         make ./datasets/UZHFPV/outdoor_forward_9_snapdragon.slam
         make ./datasets/UZHFPV/outdoor_forward_10_snapdragon.slam

        ### UZHFPV Outdoor forward facing Davis ###
         make ./datasets/UZHFPV/outdoor_forward_1_davis_with_gt.slam
         make ./datasets/UZHFPV/outdoor_forward_2_davis.slam
         make ./datasets/UZHFPV/outdoor_forward_3_davis_with_gt.slam
         make ./datasets/UZHFPV/outdoor_forward_5_davis_with_gt.slam
         make ./datasets/UZHFPV/outdoor_forward_6_davis.slam
         make ./datasets/UZHFPV/outdoor_forward_9_davis.slam
         make ./datasets/UZHFPV/outdoor_forward_10_davis.slam

        ### UZHFPV Outdoor 45 degree downward facing Snapdragon ###
         make ./datasets/UZHFPV/outdoor_45_1_snapdragon_with_gt.slam
         make ./datasets/UZHFPV/outdoor_45_2_snapdragon.slam

        ### UZHFPV Outdoor 45 degree downward facing Davis ###
         make ./datasets/UZHFPV/outdoor_45_1_davis_with_gt.slam
         make ./datasets/UZHFPV/outdoor_45_2_davis.slam

        ### SVO Artificial Dataset ###
         make ./datasets/SVO/artificial.slam

        ### OpenLORIS Cafe ###
         make ./datasets/OpenLORIS/cafe1/cafe1-1.slam
         make ./datasets/OpenLORIS/cafe1/cafe1-2.slam

        ### OpenLORIS Corridor ###
         make ./datasets/OpenLORIS/corridor1/corridor1-1.slam
         make ./datasets/OpenLORIS/corridor1/corridor1-2.slam
         make ./datasets/OpenLORIS/corridor1/corridor1-3.slam
         make ./datasets/OpenLORIS/corridor1/corridor1-4.slam
         make ./datasets/OpenLORIS/corridor1/corridor1-5.slam

        ### OpenLORIS Home ###
         make ./datasets/OpenLORIS/home1/home1-1.slam
         make ./datasets/OpenLORIS/home1/home1-2.slam
         make ./datasets/OpenLORIS/home1/home1-3.slam
         make ./datasets/OpenLORIS/home1/home1-4.slam
         make ./datasets/OpenLORIS/home1/home1-5.slam

        ### OpenLORIS Market ###
         make ./datasets/OpenLORIS/market1/market1-1.slam
         make ./datasets/OpenLORIS/market1/market1-2.slam
         make ./datasets/OpenLORIS/market1/market1-3.slam

        ### OpenLORIS Office ###
         make ./datasets/OpenLORIS/office1/office1-1.slam
         make ./datasets/OpenLORIS/office1/office1-2.slam
         make ./datasets/OpenLORIS/office1/office1-3.slam
         make ./datasets/OpenLORIS/office1/office1-4.slam
         make ./datasets/OpenLORIS/office1/office1-5.slam
         make ./datasets/OpenLORIS/office1/office1-6.slam
         make ./datasets/OpenLORIS/office1/office1-7.slam

        ### VolumeDeform VolumeDeform ###
         make ./datasets/VolumeDeform/adventcalender.slam
         make ./datasets/VolumeDeform/boxing.slam
         make ./datasets/VolumeDeform/hoodie.slam
         make ./datasets/VolumeDeform/minion.slam
         make ./datasets/VolumeDeform/shirt.slam
         make ./datasets/VolumeDeform/sunflower.slam
         make ./datasets/VolumeDeform/umbrella.slam
         make ./datasets/VolumeDeform/upperbody.slam

        Datasets with a ROS option: TUM
         Use the use_rosbag option to build a dataset from a ROS bag:
         make ./datasets/TUM/freiburg1/rgbd_dataset_freiburg1_xyz.slam use_rosbag

Creating a volume is simple: select the dataset mode, use the -d flag to specify the dataset to be built from the list above, and use the -v flag to specify the name of the volume.

nrad@LAPTOP-0R48DLRK:~/slambench$ python3 starter.py dataset -h
usage: starter.py dataset [-h] -t {list,make} [-v VOLUME_NAME] [-d DATASET]

optional arguments:
  -h, --help            show this help message and exit
  -t {list,make}, --type {list,make}
  -v VOLUME_NAME, --volume_name VOLUME_NAME
                        Specify the volume name.
  -d DATASET, --dataset DATASET
                        Specify the dataset.

Example 1

  1. I want to build indoor_forward_3_snapdragon_with_gt into a volume named FDRAGON.
  2. I can execute python3 starter.py dataset -t list to see the exact name of the dataset that needs to be passed. I find the line make ./datasets/UZHFPV/indoor_forward_3_snapdragon_with_gt.slam. The name of the dataset is the portion after the make command, namely ./datasets/UZHFPV/indoor_forward_3_snapdragon_with_gt.slam.
  3. Now all I have to do is execute the command python3 starter.py dataset -t make -d ./datasets/UZHFPV/indoor_forward_3_snapdragon_with_gt.slam -v FDRAGON
  4. You should now be able to see the volume within Docker, containing all the data needed for running (see the sketch below).
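To double-check that the volume exists, you can use the standard Docker CLI directly (these are plain Docker commands, independent of starter.py):

# List all Docker volumes; FDRAGON should appear in the output
docker volume ls

# Inspect the volume to see where its data lives on the host
docker volume inspect FDRAGON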

Running an algorithm

This assumes that you already have the algorithm volume built. If not, please see the page here. To run any benchmark you will also need a dataset volume ready. See above for instructions.

There are four main modes in which a benchmark can be run with Docker.

nrad@LAPTOP-0R48DLRK:~/slambench$ python3 starter.py run -h
usage: starter.py run [-h] [-dv DATASET_VOLUME] [-d DATASET] [-a [ALGORITHM [ALGORITHM ...]]] -t {cli,gui,interactive-cli,interactive-gui} [-s SAVE_CONFIG] [slamopt [slamopt ...]]

positional arguments:
  slamopt

optional arguments:
  -h, --help            show this help message and exit
  -dv DATASET_VOLUME, --dataset_volume DATASET_VOLUME
                        Specify the volume to be mounted for the dataset.
  -d DATASET, --dataset DATASET
                        Specify the dataset file to be used
  -a [ALGORITHM [ALGORITHM ...]], --algorithm [ALGORITHM [ALGORITHM ...]]
                        Specify algorithms to be used. Must be of the form <algorithm_name>/<library_name>. Refer to the wiki for more information.
  -t {cli,gui,interactive-cli,interactive-gui}, --type {cli,gui,interactive-cli,interactive-gui}
  -s SAVE_CONFIG, --save_config SAVE_CONFIG
                        Save current configuration to config file

These are:

  • cli = run a simple benchmark with just command-line output
  • gui = run a benchmark with the visualiser
  • interactive-cli = lets you interact within the container environment; command-line output only
  • interactive-gui = lets you interact within the container environment; GUI output enabled through X windows (a usage sketch follows this list)
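For example, the same dataset/algorithm pair can be launched in any of these modes just by changing the -t flag. A sketch reusing the FDRAGON volume and lsdslam library from the examples on this page (exact flag requirements in the interactive modes may differ):

# Command-line output only
python3 starter.py run -t cli -dv FDRAGON -d indoor_forward_3_snapdragon_with_gt.slam -a lsdslam/liblsdslam-cpp-library.so

# Same benchmark, but with the visualiser
python3 starter.py run -t gui -dv FDRAGON -d indoor_forward_3_snapdragon_with_gt.slam -a lsdslam/liblsdslam-cpp-library.so

# Drop into the container environment instead of launching the benchmark directly
python3 starter.py run -t interactive-cli -dv FDRAGON -d indoor_forward_3_snapdragon_with_gt.slam -a lsdslam/liblsdslam-cpp-library.so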

Example 2

  • When running an algorithm, the full path to the dataset does not need to be specified. For example, to use the dataset we constructed earlier, all I have to specify is the volume and the name of the dataset, as follows: -dv FDRAGON -d indoor_forward_3_snapdragon_with_gt.slam

  • Specifying algorithms is straightforward. When a volume for an algorithm is built, its name follows the standard <alg_name>-vol. As such, when we specify an algorithm, we specify it as a path of the form <alg_name>/<implementation_name>. For example, to run lsdslam with the C++ implementation, I pass -a lsdslam/liblsdslam-cpp-library.so

  • The complete command for running lsdslam on the indoor_forward_3_snapdragon_with_gt dataset with visualisation is:

python3 starter.py run -t gui -dv FDRAGON -d indoor_forward_3_snapdragon_with_gt.slam -a lsdslam/liblsdslam-cpp-library.so

  • SLAMBench can take extra arguments at runtime. Going forward, the plan is for these to be loaded from config files. Currently, the arguments can be passed at the end of the starter.py command, after the -- separator (a config-saving sketch follows below):

python3 starter.py run -t gui -dv FDRAGON -d indoor_forward_3_snapdragon_with_gt.slam -a lsdslam/liblsdslam-cpp-library.so -- --log-file log.txt
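The help output also lists a -s / --save_config flag for saving the current configuration to a config file. A hedged sketch of combining it with extra runtime arguments (the filename my_run.cfg is only an example; the file format is whatever starter.py writes):

python3 starter.py run -t cli -dv FDRAGON -d indoor_forward_3_snapdragon_with_gt.slam -a lsdslam/liblsdslam-cpp-library.so -s my_run.cfg -- --log-file log.txt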