Mapping Module API Documentation

Function Reference

load_segmentation_model

load_segmentation_model(name, binary_data)

Loads a user-specified ONNX segmentation model into the Perception Box for future use.

Description

This function allows users to upload and register a custom semantic segmentation model in ONNX format. The model is saved to the onnx_models/ directory as <name>.onnx under the provided name and validated to ensure it loads correctly with ONNX Runtime.

Parameters

  • name (str):
    The name to save the model under. This name will be used later to reference the model during mapping.
  • binary_data (Binary):
    A binary-encoded ONNX file, typically sent using xmlrpc.client.Binary(open("model.onnx", "rb").read()).

Returns

  • str: A message indicating whether the model was saved and loaded successfully.

Example Usage

import xmlrpc.client
perception_box = xmlrpc.client.ServerProxy("http://10.192.142.59:5001")
with open("esanet.onnx", "rb") as f:
    model_binary = xmlrpc.client.Binary(f.read())
response = perception_box.load_segmentation_model("esanet", model_binary)
print(response)

Additional Notes

  • Please refer to the ONNX segmentation model transfer documentation for more information about making .onnx files for your segmentation models; a brief export sketch is shown below.
  • After loading, the model can be used in start_mapping(onnx_model_name="esanet").
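
The sketch below shows one way to produce an .onnx file from a PyTorch segmentation network. The torchvision model, input resolution, and opset version are illustrative assumptions only; follow the ONNX segmentation model transfer documentation for the exact input layout and preprocessing the Perception Box expects.

import torch
import torchvision

# Hypothetical example: wrap a torchvision segmentation network so it returns a
# plain logits tensor instead of a dict, which simplifies the ONNX export.
class SegWrapper(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.net = torchvision.models.segmentation.deeplabv3_resnet50(weights=None)

    def forward(self, x):
        return self.net(x)["out"]  # class logits, shape (N, n_labels, H, W)

model = SegWrapper().eval()
dummy = torch.randn(1, 3, 480, 640)  # input size is an assumption; match your camera
torch.onnx.export(model, dummy, "deeplabv3.onnx", opset_version=17,
                  input_names=["rgb"], output_names=["logits"])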

list_onnx_models

list_onnx_models()

Lists all ONNX segmentation models currently available in the Perception Box.

Description

This function returns a list of all user-loaded ONNX model names stored in the onnx_models/ directory. These models can be referenced by name when starting a mapping session with semantic integration.

Parameters

None

Returns

  • list of str: Model names (excluding the .onnx extension).

Example Usage

models = perception_box.list_onnx_models()
print(models)  # Output: ['esanet_preproc', 'deeplabv3', ...]

Additional Notes

  • If no models are present, the returned list will be empty.
  • Only files ending with .onnx are included.
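
A common pattern is to check model availability before starting a semantic mapping session; the model name "esanet" below is just the example used earlier on this page.

import xmlrpc.client
perception_box = xmlrpc.client.ServerProxy("http://10.192.142.59:5001")

models = perception_box.list_onnx_models()
if "esanet" not in models:
    # Upload the model first if it is not already on the Perception Box.
    with open("esanet.onnx", "rb") as f:
        perception_box.load_segmentation_model("esanet", xmlrpc.client.Binary(f.read()))
perception_box.start_mapping(integrate_semantics=True, onnx_model_name="esanet")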

delete_onnx_model

delete_onnx_model(name)

Deletes an ONNX segmentation model previously stored on the Perception Box.

Description

This function deletes the model file named <name>.onnx from the onnx_models/ folder. This is useful for cleaning up unused models or replacing outdated ones.

Parameters

  • name (str):
    The name of the model to delete (without .onnx extension).

Returns

  • str: A status message indicating whether the model was deleted or not found.

Example Usage

response = perception_box.delete_onnx_model("esanet_preproc")
print(response)  # Output: "Model 'esanet_preproc' deleted." or "Model 'esanet_preproc' not found."

Additional Notes

  • This operation is irreversible. Use with caution.
  • If the specified model does not exist, an appropriate message is returned.

start_mapping

start_mapping(integrate_semantics=True, color=False, voxel_size=0.05, res=8, initial_num_blocks=17500, onnx_model_name=None)

Starts the 3D semantic mapping process for the Perception Box.

Description

This function initializes the mapping process by setting up the necessary threads and starting the mapping module. If semantic integration is enabled, it incorporates semantic labels into the map. Additionally, the function allows for the optional integration of color information. Users can configure core reconstruction parameters such as voxel size, voxel block resolution, and initial number of blocks. The mapping backend is based on the VoxelBlockGrid implementation from Open3D.

Parameters

  • integrate_semantics (bool, optional, default: True):
    Enables or disables semantic label integration. When True, semantic labels are processed and integrated into the map.
  • color (bool, optional, default: False):
    Enables or disables the integration of color information in the map.
  • voxel_size (float, optional, default: 0.05): The physical size of each voxel in meters. Smaller voxels increase reconstruction detail but require more memory.
  • res (int, optional, default: 8): Resolution of each voxel block. Each block contains res³ voxels.
  • initial_num_blocks (int, optional, default: 17500): Initial number of voxel blocks to allocate memory for in the VoxelBlockGrid. Can be tuned based on expected map size and available memory.
  • onnx_model_name (str, optional, default: None): Name of the ONNX model to use for semantic segmentation. Make sure to load and name the model beforehand using load_segmentation_model().

Returns

  • str: A status message indicating whether the mapping task has started successfully or is already running.

Example Usage

import xmlrpc.client
perception_box = xmlrpc.client.ServerProxy("http://10.192.142.59:5001") 
# local network IP and port number you're using
status = perception_box.start_mapping(integrate_semantics=True, color=True)
print(status)  # Output: "Task started" or "Task is already running"
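
A fully parameterized call might look like the sketch below. The specific values for voxel_size, res, and initial_num_blocks are illustrative only and should be tuned to your scene and memory budget (see Additional Notes), and "esanet" must already have been loaded with load_segmentation_model().

status = perception_box.start_mapping(
    integrate_semantics=True,
    color=True,
    voxel_size=0.02,           # finer voxels give more detail but use more memory
    res=8,                     # each voxel block contains 8^3 voxels
    initial_num_blocks=30000,  # initial allocation; may double if the grid fills up
    onnx_model_name="esanet",
)
print(status)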

Additional Notes

  • This function initializes a semantic segmentation model if integrate_semantics is True. Reconstruction parameters such as depth_scale, depth_max, res, voxel_size, and n_labels can be specified in the configuration YAML file.
  • The number of voxel blocks (initial_num_blocks) sets the initial memory allocation. If the grid fills up during mapping, the system may attempt to double the number of allocated blocks, which increases memory usage. Plan accordingly based on GPU or system constraints.
  • Two threads are started: one receives frame data from the SLAM module, and the other executes the mapping process.

stop_mapping

stop_mapping() 

Stops the 3D semantic mapping process for the Perception Box.

Description

This function halts the mapping process by stopping all active threads and releasing resources associated with the mapping task. It ensures a clean shutdown of the mapping module to prevent resource leaks, and it deletes the map. If you still need the map, use 'get_map_stop_mapping()' instead, which also returns the map.

Parameters

  • None.

Returns

  • str: A status message indicating whether the mapping process was successfully stopped or if it was not running.

Example Usage

import xmlrpc.client
perception_box = xmlrpc.client.ServerProxy("http://10.192.142.59:5001") 
# local network IP and port number you're using
status = perception_box.stop_mapping()
print(status)  # Output: "Mapping stopped successfully" or "No mapping is currently running"

Additional Notes

  • This function ensures that all threads and resources are cleaned up when the mapping task is stopped.
  • If no mapping process was running, it will notify the user that the task is not active.


pause_mapping

pause_mapping() 

Pauses the 3D semantic mapping process for the Perception Box.

Description

This function temporarily halts the mapping process by stopping the addition of new frames to the mapping module's queue. The SLAM module continues to operate normally while the mapping process is paused.

Parameters

  • None.

Returns

  • str: A status message indicating that the mapping process has been paused.
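
Example Usage

The snippet below follows the same pattern as the other examples; the exact wording of the returned status string is shown only as an illustration.

import xmlrpc.client
perception_box = xmlrpc.client.ServerProxy("http://10.192.142.59:5001")
status = perception_box.pause_mapping()
print(status)  # e.g. a message confirming that mapping is paused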


resume_mapping

resume_mapping()

Resumes the 3D semantic mapping process for the Perception Box.

Description

This function resumes the mapping process by allowing new frames to be added to the mapping module's queue. It reverses the pause state set by pause_mapping.

Parameters

  • None.

Returns

  • str: A status message indicating that the mapping process has been resumed.
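
Example Usage

As with pause_mapping, the returned status string below is illustrative only.

import xmlrpc.client
perception_box = xmlrpc.client.ServerProxy("http://10.192.142.59:5001")
status = perception_box.resume_mapping()
print(status)  # e.g. a message confirming that mapping has resumed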


get_semantic_map

get_semantic_map(map_type="pcd", top_label=True)

Retrieves the semantic map from the mapping process, either as a point cloud (pcd) or a mesh.

Description

This function extracts the semantic map from the reconstruction object. If the map type is "pcd", it returns the points and semantic labels. If the map type is "mesh", it returns the vertices, triangles, and semantic labels. The labels are either the top label per voxel or a full array of class probabilities depending on the top_label parameter.

Parameters

  • map_type (str, optional, default: "pcd"): The format of the map to retrieve. Must be either "pcd" or "mesh".
  • top_label (bool, optional, default: True): If True, the labels are reduced to the top label per voxel or vertex. If False, the function returns the full semantic probability vectors.

Returns

  • For map_type="pcd": A dictionary containing:
    • points (list): List of point coordinates.
    • labels (list): Semantic labels or probability vectors for each point.
  • For map_type="mesh": A dictionary containing:
    • vertices (list): List of vertex coordinates.
    • triangles (list): List of triangles connecting the vertices.
    • labels (list): Semantic labels or probability vectors for each vertex.
  • str: An error message if semantics were not integrated or if the reconstruction object is not initialized.
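
Example Usage

A minimal sketch, assuming the returned dictionary uses the field names listed above ("points" and "labels" for "pcd"); adjust the keys if your build differs.

import numpy as np
import xmlrpc.client

perception_box = xmlrpc.client.ServerProxy("http://10.192.142.59:5001")
semantic_map = perception_box.get_semantic_map(map_type="pcd", top_label=True)
points = np.asarray(semantic_map["points"])  # (N, 3) point coordinates
labels = np.asarray(semantic_map["labels"])  # (N,) top class label per point
print(points.shape, labels.shape)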

Additional Notes

  • Ensure integrate_semantics was set to True when starting the mapping process; otherwise, this function will return an error.
  • For "pcd", labels correspond to the points. For "mesh", labels correspond to the vertices.


get_metric_map

get_metric_map(map_type="pcd")

Retrieves the metric map from the mapping process, either as a point cloud (pcd) or a mesh.

Description

This function extracts the metric map from the reconstruction object. If the map type is "pcd", it returns the points and their associated colors (if enabled). If the map type is "mesh", it returns the vertices, triangles, and their associated colors (if enabled). If color integration was disabled during the mapping process, the colors will be returned as None.

Parameters

  • map_type (str, optional, default: "pcd"): The format of the map to retrieve. Must be either "pcd" or "mesh".

Returns

  • For map_type="pcd": A dictionary containing:
    • points (list): List of point coordinates.
    • colors (list or None): RGB color values for each point, or None if colors were not integrated.
  • For map_type="mesh": A dictionary containing:
    • vertices (list): List of vertex coordinates.
    • triangles (list): List of triangles connecting the vertices.
    • colors (list or None): RGB color values for each vertex, or None if colors were not integrated.
  • str: An error message if the reconstruction object is not initialized.
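
Example Usage

A minimal sketch, assuming the returned dictionary uses the field names listed above and that colors, when present, are RGB values in [0, 1]; the Open3D conversion is just one way to inspect the result.

import numpy as np
import open3d as o3d
import xmlrpc.client

perception_box = xmlrpc.client.ServerProxy("http://10.192.142.59:5001")
metric_map = perception_box.get_metric_map(map_type="pcd")

pcd = o3d.geometry.PointCloud()
pcd.points = o3d.utility.Vector3dVector(np.asarray(metric_map["points"]))
if metric_map["colors"] is not None:
    pcd.colors = o3d.utility.Vector3dVector(np.asarray(metric_map["colors"]))
o3d.visualization.draw_geometries([pcd])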

Additional Notes

  • Ensure color integration was enabled when starting the mapping process; otherwise, colors will be returned as None.
  • For "pcd", colors correspond to the points. For "mesh", colors correspond to the vertices.
