ROS Interfaces in Robosuite

robosuite is a nice framework that runs on top of the MuJoCo simulator (www.mujoco.org). All information about robosuite can be found at www.robosuite.ai.

The framework provides a set of tools to more easily create tasks across a number of robots, grippers, arenas, and objects. Furthermore, it gives you powerful tools for different kinds of randomization (domain, physics) and for sensor corruption in simulation to approximate a real environment; it also facilitates teaching by demonstration, and more.

The framework is very well designed and fully modular. robosuite is compatible with the OpenAI Gym API, which makes it easy to run deep reinforcement learning (DRL) algorithms on its environments, as the sketch below shows.
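For orientation, here is a minimal rollout loop using robosuite's standard API (suite.make, reset(), step()); the flags shown are common options and may differ slightly across robosuite versions:

```python
# Minimal robosuite rollout; robosuite environments follow the familiar
# Gym pattern of reset()/step() even without the Gym wrapper.
import numpy as np
import robosuite as suite

# Create a Lift task with a Panda arm, configured for headless,
# state-based (non-camera) training.
env = suite.make(
    env_name="Lift",
    robots="Panda",
    has_renderer=False,
    has_offscreen_renderer=False,
    use_camera_obs=False,
)

obs = env.reset()
low, high = env.action_spec  # per-dimension action bounds

for _ in range(100):
    action = np.random.uniform(low, high)       # random exploratory action
    obs, reward, done, info = env.step(action)  # standard Gym-style tuple
    if done:
        obs = env.reset()
env.close()
```

robosuite also ships a GymWrapper (in robosuite.wrappers) for strict Gym API compatibility when a library expects a gym.Env.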

The goal of this project is to keep the skeleton of the robosuite simulation framework (https://robosuite.ai/docs/simulation/robot.html) intact, while being able to seamlessly run the same code against the real robot instead of the MuJoCo simulator.

The aim is to connect to the real robot via ROS (www.ros.org). For now, we will work with ROS 1 Noetic on Ubuntu 20.04. More specifically, the plan is to create an alternative set of internal functions for robosuite that invoke the appropriate ROS nodes/services/actions when a real robot is in use.
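As a purely hypothetical sketch of that seam (none of these class names exist in robosuite today), the idea is a single robot-facing interface with a MuJoCo implementation and a ROS implementation behind it:

```python
# Hypothetical sketch of the sim/real seam. All class names here are
# placeholders illustrating where ROS calls would replace MuJoCo calls.
from abc import ABC, abstractmethod

class RobotBackend(ABC):
    @abstractmethod
    def get_joint_angles(self):
        """Return the current joint configuration."""

    @abstractmethod
    def send_joint_command(self, q_desired):
        """Command the arm toward the given joint configuration."""

class MujocoBackend(RobotBackend):
    """Reads and writes MuJoCo sim state, as robosuite already does internally."""
    def __init__(self, sim, joint_indexes, actuator_indexes):
        self.sim = sim
        self.joint_indexes = joint_indexes
        self.actuator_indexes = actuator_indexes

    def get_joint_angles(self):
        return self.sim.data.qpos[self.joint_indexes]

    def send_joint_command(self, q_desired):
        self.sim.data.ctrl[self.actuator_indexes] = q_desired

class RosBackend(RobotBackend):
    """Would wrap ROS publishers/subscribers; see the Panda sketch below."""
    def get_joint_angles(self):
        raise NotImplementedError("subscribe to /joint_states")

    def send_joint_command(self, q_desired):
        raise NotImplementedError("publish/act on the arm's command interface")
```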

To start, the Panda robot can be used (including the simulated Panda ROS version). A Gym-like interface that uses ROS on the Panda already exists, and it is a nice starting place to understand how to run ROS under Gym. To find out more, see: https://github.com/learningLogisticsLab/panda_robot

When we program ROS code, we want to abstract publications/subscriptions away inside classes/constructors and simply call simple methods to achieve different functionality. A good example of this is Baxter's API by Rethink Robotics. For example, in their 'Hello World' example, look at how you can instantiate the Limb class and then simply call a method to (i) get the arm's joint angles or (ii) move the arm to a desired location. The API for Baxter's system is found here. Please verify whether this kind of abstraction already exists for the Panda. Otherwise, we will create it and use it to communicate with the Panda, including any associated sensors such as cameras and force/torque (FT) sensors.
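As a hedged sketch of the kind of wrapper we would build if one does not already exist, here is a Limb-like class for the Panda, modeled on baxter_interface.Limb. The trajectory action name follows common franka_ros conventions but is an assumption to verify on our setup:

```python
# Sketch of a Limb-like wrapper for the Panda. The controller/action name
# below is an ASSUMPTION based on typical franka_ros configurations.
import rospy
import actionlib
from sensor_msgs.msg import JointState
from control_msgs.msg import FollowJointTrajectoryAction, FollowJointTrajectoryGoal
from trajectory_msgs.msg import JointTrajectoryPoint

class PandaLimb:
    def __init__(self):
        self._joint_state = None
        # Hide the subscription inside the constructor, as baxter_interface does.
        rospy.Subscriber("/joint_states", JointState, self._on_joint_state)
        # Action name is a placeholder taken from franka_ros examples.
        self._traj_client = actionlib.SimpleActionClient(
            "/position_joint_trajectory_controller/follow_joint_trajectory",
            FollowJointTrajectoryAction,
        )
        self._traj_client.wait_for_server()

    def _on_joint_state(self, msg):
        self._joint_state = msg

    def joint_angles(self):
        """(i) Return the current joint angles as a name -> position dict."""
        msg = self._joint_state
        return dict(zip(msg.name, msg.position)) if msg else {}

    def move_to_joint_positions(self, positions, duration=3.0):
        """(ii) Move the arm to the desired joint configuration (a dict)."""
        goal = FollowJointTrajectoryGoal()
        goal.trajectory.joint_names = list(positions.keys())
        point = JointTrajectoryPoint()
        point.positions = list(positions.values())
        point.time_from_start = rospy.Duration(duration)
        goal.trajectory.points = [point]
        self._traj_client.send_goal_and_wait(goal)

if __name__ == "__main__":
    rospy.init_node("panda_limb_demo")
    limb = PandaLimb()
    rospy.sleep(1.0)            # let the first JointState message arrive
    print(limb.joint_angles())  # (i) read state with one method call
```

The point of the design is the same as Baxter's: user code never touches topics or messages directly, only small, readable methods.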

Examples of RL Interfaces with ROS

Also study QT-Opt, which inspires our work.

What we would replace:

  • All sensor observations: get them from ROS

  • All commands to move the robot (i.e., the step() methods)

  • These methods would live inside the following (see the sketch below):
      ◦ step(): take an action and return observations
      ◦ reward(): analyze the robot/scene via cameras/sensors and return a reward
      ◦ reset(): move the robot to a start configuration and analyze the scene
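A hypothetical sketch of those three methods, assuming observations and commands flow through ROS. PandaLimb is the wrapper sketched earlier; the reward computation is a stub standing in for the camera/sensor analysis described above:

```python
# Hypothetical ROS-backed environment. Class and method names are
# placeholders, not existing robosuite or panda_robot code.
import numpy as np

class RosPandaEnv:
    def __init__(self, limb, home_positions):
        self.limb = limb            # Limb-like ROS wrapper (see sketch above)
        self.home = home_positions  # dict: joint name -> home angle

    def _observe(self):
        # All sensor observations come from ROS topics.
        q = self.limb.joint_angles()
        return np.array([q[name] for name in sorted(q)])

    def step(self, action):
        # Commands to move: send the action through ROS, then observe.
        target = {name: float(a) for name, a in zip(sorted(self.home), action)}
        self.limb.move_to_joint_positions(target)
        obs = self._observe()
        return obs, self.reward(obs), False, {}

    def reward(self, obs):
        # Stub: would analyze the scene via cameras/FT sensors.
        return 0.0

    def reset(self):
        # Move the robot to its start configuration and re-analyze the scene.
        self.limb.move_to_joint_positions(self.home)
        return self._observe()
```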

Roadmap

  1. Do a presentation on the software organization of the ROS/GYM interface at: https://github.com/learningLogisticsLab/panda_robot
  2. Do a presentation on the software organization of robosuite.ai
  3. Chart a roadmap of how robosuite.ai will be adapted to work with real robots without significant modifications to the robosuite.ai interface.
  4. Set milestones and report on each of them.