Workflow

The files linked below are on the nchan branch of a forked NRG repository.

Collect rosbag data

Topics

  • /imu/data
  • /kinect_imu
  • /camera/rgb/image_raw/compressed
  • /odom
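
To sanity-check a recording, the bag's topic index can be inspected with the ROS 1 rosbag Python API. A minimal sketch, assuming a hypothetical bag filename:

```python
import rosbag

# Topics the data collection step is expected to record.
EXPECTED_TOPICS = [
    "/imu/data",
    "/kinect_imu",
    "/camera/rgb/image_raw/compressed",
    "/odom",
]

# "sterling_run.bag" is a placeholder filename, not a file from the repo.
with rosbag.Bag("sterling_run.bag") as bag:
    recorded = bag.get_type_and_topic_info().topics
    for topic in EXPECTED_TOPICS:
        print(topic, "ok" if topic in recorded else "MISSING")
```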

Preprocess rosbag

rosbag_data_recorder.py

Converts a rosbag into a pickle file containing the following data (see the sketch after this list):

  • Images: Processed images or bird's-eye view (BEV) images.
  • IMU Data: Flattened IMU data from both Kinect and Jackal sensors.
  • Odometry Data: Position and orientation data from odometry.
  • Orientation Data: Orientation data from the Jackal robot.
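
A minimal sketch of the pickle layout, assuming hypothetical key names (the actual schema is defined in rosbag_data_recorder.py):

```python
import pickle

# Placeholder containers; the real recorder fills these from rosbag
# messages. The key names here are assumptions, not the actual schema.
data = {
    "images": [],       # processed / bird's-eye view (BEV) images
    "kinect_imu": [],   # flattened IMU samples from the Kinect
    "jackal_imu": [],   # flattened IMU samples from the Jackal
    "odom": [],         # position and orientation from /odom
    "orientation": [],  # orientation data from the Jackal
}

with open("run_data.pkl", "wb") as f:
    pickle.dump(data, f)
```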

Train model

models.py

Defines the PyTorch models.
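
A minimal sketch of the kind of encoders defined there; layer sizes and embedding dimensions are assumptions, not the actual architectures:

```python
import torch.nn as nn

# Hypothetical visual encoder: maps a BEV image patch to an embedding.
class VisualEncoder(nn.Module):
    def __init__(self, embed_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, embed_dim),
        )

    def forward(self, x):
        return self.net(x)

# Hypothetical proprioceptive encoder for flattened IMU vectors.
class ProprioceptiveEncoder(nn.Module):
    def __init__(self, imu_dim=200, embed_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(imu_dim, 128), nn.ReLU(),
            nn.Linear(128, embed_dim),
        )

    def forward(self, x):
        return self.net(x)
```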

train_naturl_representations.py

Converts the pickle file into representation clusters (see the sketch after this list):

  • Saves the k-means clustering model.
  • Saves the k-means clustering labels and sample indices.
  • Saves the state dictionaries of the visual and proprioceptive encoders.
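
A minimal sketch of the clustering-and-saving step, assuming scikit-learn's KMeans, random stand-in encoders and embeddings, and hypothetical filenames:

```python
import pickle
import numpy as np
import torch
from sklearn.cluster import KMeans

# Stand-ins for the trained encoders and their embeddings of the
# dataset; shapes, filenames, and the cluster count are assumptions.
visual_encoder = torch.nn.Linear(32, 64)
proprio_encoder = torch.nn.Linear(200, 64)
embeddings = np.random.rand(1000, 64).astype(np.float32)

kmeans = KMeans(n_clusters=8, n_init=10).fit(embeddings)

# Save the clustering model, its labels/sample indices, and the
# state dictionaries of both encoders.
with open("kmeans_model.pkl", "wb") as f:
    pickle.dump(kmeans, f)
np.save("kmeans_labels.npy", kmeans.labels_)
np.save("sample_indices.npy", np.arange(len(embeddings)))
torch.save(visual_encoder.state_dict(), "visual_encoder.pt")
torch.save(proprio_encoder.state_dict(), "proprio_encoder.pt")
```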

train_naturl_cost.py

Maps representations to raw cost values (see the sketch after this list):

  • Saves the PyTorch neural network model:
    • A combination of the visual encoder and the cost network.
    • Saved as a state dictionary, which includes the parameters (weights and biases) of the neural network.
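
A minimal sketch of this artifact, assuming a hypothetical cost head stacked on the visual encoder; dimensions and filenames are assumptions:

```python
import torch
import torch.nn as nn

# Hypothetical cost network: a small MLP head on top of the visual
# encoder's embedding.
visual_encoder = nn.Sequential(
    nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 64),
)
cost_head = nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 1))

# The combined model maps a BEV patch directly to a scalar cost.
cost_model = nn.Sequential(visual_encoder, cost_head)

# Saved as a single state dictionary (weights and biases of both parts).
torch.save(cost_model.state_dict(), "cost_model.pt")
```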

Robot deployment

terrain_evaluator.cc

```cpp
// Determines the cost of potential paths by parsing the RGB cost image.
std::shared_ptr<PathRolloutBase> TerrainEvaluator::FindBest(
    const std::vector<std::shared_ptr<PathRolloutBase>>& paths);

// Computes a scalar cost image from a bird's-eye view (BEV) image using 'cost_model_'.
cv::Mat1f TerrainEvaluator::GetScalarCostImage(const cv::Mat3b& bev_image);
```
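
For intuition only, a Python sketch of what GetScalarCostImage computes: the trained cost model is evaluated over patches of the BEV image to build a coarse scalar cost map. Patch size, stride, and the stand-in model are assumptions, not the C++ implementation:

```python
import torch

@torch.no_grad()
def scalar_cost_image(bev_image, cost_model, patch=64, stride=64):
    """bev_image: (3, H, W) float tensor; returns a coarse cost map."""
    _, H, W = bev_image.shape
    rows, cols = H // stride, W // stride
    cost = torch.zeros(rows, cols)
    for i in range(rows):
        for j in range(cols):
            # Crop one patch and score it with the cost model.
            p = bev_image[:, i*stride:i*stride+patch, j*stride:j*stride+patch]
            if p.shape[1] == patch and p.shape[2] == patch:
                cost[i, j] = cost_model(p.unsqueeze(0)).item()
    return cost

# Example usage with a stand-in model (any module mapping a patch to a scalar).
model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 64 * 64, 1))
bev = torch.rand(3, 256, 256)
print(scalar_cost_image(bev, model).shape)  # torch.Size([4, 4])
```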

navigation.cc

```cpp
// Loads the 'evaluator_' for graph navigation.
if (params_.evaluator_type == "terrain") {
  auto terrain_evaluator = std::make_shared<TerrainEvaluator>();
  terrain_evaluator->LoadModel();
  evaluator_ = terrain_evaluator;
}

// Gets the best candidate PathRolloutBase path.
auto best_path = evaluator_->FindBest(paths);
```

amrl spot_autonomy

Used to deploy the graph navigation stack on the robot: