Experiments - uic-evl/DOE_DigitalTwin GitHub Wiki

# Digital Twin Experiments

As we investigate the creation of a VR digital twin, we are testing many different methods. Here, we keep track of what is possible across OS platforms (Windows, Linux, Mac) and XR platforms (Unity, Unreal, Omniverse/Isaac Sim).

> [!NOTE]
> A blank field indicates an untested case.

## Phase 1: Machine Set-Up

First, we need at least one machine running each OS that is capable of everything listed below.

| | Windows 11 | Windows 10 | Ubuntu | Mac |
| --- | --- | --- | --- | --- |
| Run Unity | | | | |
| Run Unreal | | | | |
| Run Omniverse | | | | |
| Supports VR/AR | | | ⚠️ ALVR | |
| Connect to WiFi | | | | |
| Run ROS/ROS2 | ⚠️ Can only run ROS in WSL | | | |

### Ongoing in this Phase

- XR on Linux:
  - While it is possible through applications like ALVR, performance is poor, and it is unclear how to improve it.
- Acquire and test on a Mac workstation.

## Phase 2: Communication and Networking

At a high level, our Digital Twin system will have a hardware component, a simulation component, and a user-interface component. In this phase, we examine different ways of networking these components and the methods they use to communicate. We are interested in two overall approaches: ROS2 networking middleware, or a custom networking implementation.

### ROS2 Network Middleware

ROS2 supports two network middleware implementations: Cyclone DDS and Fast DDS (the default). Each implementation has a variety of configurable parameters, such as disabling multicast to force unicast communication.
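
As a concrete sketch of these parameters (not our finalized configuration): the middleware can be selected per shell via `RMW_IMPLEMENTATION`, and Cyclone DDS can be told to disable multicast through an XML profile. The file path and ROS distro package name below are assumptions.

```shell
# Select the RMW implementation for this shell. The matching package must be
# installed (e.g. ros-humble-rmw-cyclonedds-cpp; distro name is an assumption).
export RMW_IMPLEMENTATION=rmw_cyclonedds_cpp

# Write a Cyclone DDS profile that disables multicast, forcing unicast
# discovery/communication. The path is arbitrary.
cat > /tmp/cyclonedds.xml <<'EOF'
<CycloneDDS>
  <Domain>
    <General>
      <AllowMulticast>false</AllowMulticast>
    </General>
  </Domain>
</CycloneDDS>
EOF

# Point Cyclone DDS at the profile.
export CYCLONEDDS_URI=file:///tmp/cyclonedds.xml
```

Any `ros2` node launched from this shell then uses Cyclone DDS with multicast disabled.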

| | Windows 11 | Windows 10 | Ubuntu | Mac |
| --- | --- | --- | --- | --- |
| Robot to machine | | | | |
| Robot to XR app | ⚠️ Does not work with ROS2 | | | |
| Robot to machine ROS2 | | | | |
| Machine ROS2 to XR app | | | | |

### Custom Networking

For our custom networking, we are testing two methods: ZeroMQ and RabbitMQ.
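
As a minimal sketch of the ZeroMQ side (using pyzmq; the port and `pose` topic prefix here are placeholders, not the protocol our apps use), a robot-state publisher and an XR-app subscriber could look like:

```python
# Minimal ZeroMQ PUB/SUB sketch (requires pyzmq: pip install pyzmq).
# The port number and "pose" topic are illustrative placeholders.
import time
import zmq

ctx = zmq.Context()

# Publisher side: e.g. a bridge process relaying robot state.
pub = ctx.socket(zmq.PUB)
pub.bind("tcp://127.0.0.1:5556")

# Subscriber side: e.g. the XR application.
sub = ctx.socket(zmq.SUB)
sub.connect("tcp://127.0.0.1:5556")
sub.setsockopt_string(zmq.SUBSCRIBE, "pose")  # filter on the 'pose' topic prefix
sub.RCVTIMEO = 2000  # ms; fail instead of blocking forever

time.sleep(0.5)  # give the subscription time to propagate (ZeroMQ "slow joiner")

pub.send_string("pose 0.1 0.2 0.0")  # topic prefix followed by the payload
msg = sub.recv_string()
print(msg)
```

PUB/SUB is fire-and-forget (late subscribers miss earlier messages), which suits streaming robot state; RabbitMQ's broker model adds durable queues at the cost of an extra process.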

### Notes

- Running ROS2 on Windows 11 requires WSL. ROS2 in WSL is unable to communicate with external hardware (i.e., the robots); simulation works fine.
- We have only achieved a connection to Create3s on our own WiFi network, supplied by a mini-router.
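
One avenue we have not verified for the WSL limitation: recent WSL releases on Windows 11 support a "mirrored" networking mode that shares the host's network interfaces with the distro, which may improve visibility of LAN devices from inside WSL (whether it passes DDS multicast discovery traffic is untested by us). It is enabled in `%UserProfile%\.wslconfig`:

```ini
; %UserProfile%\.wslconfig -- requires a recent WSL on Windows 11 (untested by us)
[wsl2]
networkingMode=mirrored
```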

### Ongoing in this Phase

- How does ROS2 networking work on Mac?
- What other communication methods might work better (TCP, UDP, wired, etc.)?
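
As a baseline for the TCP/UDP question, a plain UDP unicast channel is easy to prototype with the Python standard library. The port and JSON schema below are made up for illustration:

```python
# Plain-UDP pose channel sketch using only the standard library.
# Port number and message schema are illustrative placeholders.
import json
import socket

ADDR = ("127.0.0.1", 9870)

# Receiver: e.g. the XR app listening for robot state.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(ADDR)
rx.settimeout(2.0)  # don't block forever if the sender is down

# Sender: e.g. a bridge process on the robot's network.
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
tx.sendto(json.dumps({"x": 0.1, "y": 0.2, "theta": 0.0}).encode(), ADDR)

data, _ = rx.recvfrom(4096)
pose = json.loads(data)
print(pose["x"], pose["y"], pose["theta"])
```

UDP tolerates dropped packets (fine for a state stream where only the latest pose matters), while TCP would guarantee ordering at the cost of head-of-line blocking.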

## Phase 3: Digital Twin

Once all of the basic communication infrastructure has been confirmed, we need to make sure we can actually visualize/simulate the robots in our VR app of choice. This step begins to incorporate broader considerations, such as deciding between direct models of the robot and abstracted representations.

As we handle increasingly sophisticated robots, this step will include familiarizing ourselves with increasingly complex sensors and input data.
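
Independent of engine, the mirroring loop reduces to: receive a state message, update the twin's transform, render. An engine-agnostic sketch (the `TwinTransform` class and message fields are hypothetical, not from our Unity/Unreal/Omniverse code):

```python
# Engine-agnostic mirroring sketch: apply incoming robot state to a twin.
# The TwinTransform class and message fields are hypothetical.
import json
from dataclasses import dataclass

@dataclass
class TwinTransform:
    x: float = 0.0
    y: float = 0.0
    theta: float = 0.0  # heading in radians

    def apply(self, msg: dict) -> None:
        # Only overwrite fields the message actually carries, so partial
        # updates (e.g. position without heading) leave the rest untouched.
        self.x = msg.get("x", self.x)
        self.y = msg.get("y", self.y)
        self.theta = msg.get("theta", self.theta)

twin = TwinTransform()
for raw in ['{"x": 1.0, "y": 2.0}', '{"theta": 1.57}']:  # stand-in for a socket feed
    twin.apply(json.loads(raw))
print(twin)
```

In a real app the loop body runs per frame (or per message), and `TwinTransform` would be the engine's own transform on the robot model.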

### 3A: Mirroring (Bot -> XR App)

- Unity: Unity Phase3, Create3 Sample Scene
- Unreal: ER Arm Guide
- Omniverse: ER Arm Guide

### 3B: Control (XR App -> Bot)

- Unity: Unity Phase3
- Unreal: ER Arm Guide
- Omniverse: ER Arm Guide

### Notes

- The Omniverse bidirectional method has only been tested with the ER arms.
- The Unity application for Windows 11 has a functional ROS2 connection, but due to limitations with running ROS on Windows 11, it can only mirror and control a simulated robot.

### Ongoing in this Phase

- Improvements to the Unreal VT UI for the Omniverse/Isaac Sim-powered digital twin example.
- Twinning Create3s in Isaac Sim.

## Phase 4: VAM-HRI

Once we are able to create a reasonably sophisticated digital twin of a given robot, we can begin to develop XR applications for human-robot interaction.

- Unity: Unity VR with ZMQ networking
- Unreal: (untested)
- Omniverse: (untested)