End Effector Pipeline
The steps below detail adding an end effector and ensuring it is operable. There is a demo file in . All that should need to be done is to download the Omniverse files, run the demo on a Nucleus server, and update the paths in the Python scripts as well as the references to those scripts on Nucleus.
PS: please send a video of this working, including someone attempting to grab a test object.
- Open the .usd that has your desired end effector and locate the referenced .urdf in the Properties panel of the Isaac Sim editor.
- Open the .urdf in your desired editor using the locate or the pencil icon.
- Make note of the mesh and visual file paths.
- In the .urdf, locate `<link name="[first part of end effector]">`.
- Copy the link code starting at the first part of the end effector all the way to the last part.
- Paste it into the .urdf of the desired robot. You may have to change mimic joints to revolute joints (a scripted version of this copy step is sketched after this list).
- Note: The origin and the roll, pitch, and yaw of each object within a link are relative to the location of their parent. It is not recommended to edit the .urdf directly to change the orientation of the end effector. Instead, find an existing .urdf with the correct orientation and copy the code from that .urdf into the desired robot's .urdf.
- In a new scene, on the top ribbon click Isaac Utils -> Workflows -> URDF Importer.
- You should now have a parent xform with the name of the robot and the desired end effector. The xform may have an orange or blue arrow (reference or payload); these suck, so go to File -> Save Flattened As….
- Saving flattened makes the .usd file heavier, but more reliable than working with references and payloads.
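For reference, the copy step above can also be scripted. This is a minimal sketch using only the Python standard library; the file names, the name prefix, and the assumption that all end-effector links and joints share that prefix are hypothetical and need to be adapted to your URDFs (mesh paths may also need updating, per the note above).

```python
import xml.etree.ElementTree as ET

# Hypothetical file names and prefix; adjust to your URDFs.
EE_URDF = "gripper.urdf"         # URDF that contains the end effector
ROBOT_URDF = "robot.urdf"        # URDF of the robot that should receive it
PREFIX = "gripper_"              # assumed common prefix of end-effector link/joint names

ee_root = ET.parse(EE_URDF).getroot()
robot_tree = ET.parse(ROBOT_URDF)
robot_root = robot_tree.getroot()

# Copy every <link> and <joint> that belongs to the end effector.
for tag in ("link", "joint"):
    for elem in ee_root.findall(tag):
        if elem.get("name", "").startswith(PREFIX):
            mimic = elem.find("mimic")
            if mimic is not None:
                # Drop the mimic constraint and drive the joint directly as revolute.
                elem.remove(mimic)
                elem.set("type", "revolute")
            robot_root.append(elem)

robot_tree.write("robot_with_gripper.urdf")
```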
There are a few steps to ensuring a good simulation of the end effector. The tests below determine how functional the end effector simulation is. Ideally, only the first test would be needed, since its success should cascade to the rest. However, even simulations aren't ideal.
- In Isaac Sim, in the top ribbon open Isaac Utils -> Articulation Inspector.
- Check that the robot's joint names are listed.
- In the JointCommandArray, click and hold on an input and slide left to right. The arm should wiggle.
- Repeat for each input to ensure that everything can move correctly in the simulation (a scripted version of this test is sketched below).
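For reference, the same wiggle test can be run from the Script Editor while the timeline is playing. The prim path below is hypothetical, and this assumes the omni.isaac.core Articulation wrapper that ships with recent Isaac Sim releases; treat it as a rough sketch rather than the exact workflow.

```python
import numpy as np
from omni.isaac.core.articulations import Articulation

# Hypothetical prim path; point this at the robot's parent xform.
arm = Articulation(prim_path="/World/robot_with_gripper")
arm.initialize()                      # the timeline must be playing
print(arm.dof_names)                  # the end-effector joints should show up here

# Nudge every joint slightly so the arm (and gripper) visibly wiggles.
home = arm.get_joint_positions()
arm.set_joint_positions(home + 0.05 * np.ones(arm.num_dof))
```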
Connecting to the arm:
- Ensure the arm is using the IP address of the computer hosting OVIS.
- Ensure that both sides use the same arbitrary port.
- Ensure that both devices are connected to the same Wi-Fi network (a minimal sketch of such a connection is shown after this list).
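The wiki doesn't name the transport here, but the scripts mentioned later (e.g. vr_ur.py) rely on zmq, so the host/arm link presumably looks something like the sketch below. The port number, the PUB/SUB pattern, and the example IP are assumptions; only the requirement that both sides agree on the OVIS host's IP and an arbitrary port comes from the list above.

```python
import zmq

PORT = 5555  # arbitrary, but it must match on both sides

# --- On the arm-side computer (connect to the OVIS host's IP; start this side first) ---
ctx = zmq.Context()
sub = ctx.socket(zmq.SUB)
sub.setsockopt_string(zmq.SUBSCRIBE, "")
sub.connect(f"tcp://192.168.1.50:{PORT}")     # hypothetical OVIS host IP
print(sub.recv_json())

# --- On the computer hosting OVIS (bind on its own IP / all interfaces) ---
ctx = zmq.Context()
pub = ctx.socket(zmq.PUB)
pub.bind(f"tcp://*:{PORT}")
pub.send_json({"gripper": "open"})            # example payload; the real one depends on your script
```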
Connecting to Vicon:
- Follow the setup described on .
- Ensure that scripts that use Vicon data use the IP address of the computer connected to the Vicon system.
- Ensure your path setup and imports work (Python): `import os`, `import sys`, `from pathlib import Path`, then `sys.path.insert(0, str(Path(<path to your scripts>)))`.
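The wiki doesn't show the receiving side, but the Vicon Tracker is described below as broadcasting object positions, which is typically done over its UDP object stream. The sketch below only demonstrates receiving the datagrams; the port number is the assumed Tracker default and should be checked against your Tracker settings, and parsing the payload into poses is left to the Vicon documentation.

```python
import socket

VICON_PORT = 51001   # assumed default UDP object-stream port; confirm in Vicon Tracker settings

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", VICON_PORT))          # listen on all interfaces for the broadcast

while True:
    data, addr = sock.recvfrom(1024)
    # The packet layout (object name, translation, rotation) is documented by Vicon;
    # parse it here before driving the corresponding object in the simulation.
    print(f"received {len(data)} bytes from {addr[0]}")
```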
The first thing that needs to be done is to create an object, set it up to be Vicon-detectable, and give it a corresponding object in the simulation. Briefly, I will explain how to create one and import it into OVIS.
It is possible to compare the pose of the real arm, measured with the Vicon trackers, to the simulation pose. However, that raises questions like: how accurate does the simulation need to be? To test this, we ask a slightly different question: can the robot complete tasks properly? Simply put, a task could be as simple as moving an object from an arbitrary point A to point B.
- Measure the real-world object's dimensions.
- Design it in your preferred 3D modeling software. If you have never used one, Onshape and Shapr3D (for iPad kids with Apple Pencils) are recommended thanks to their low barrier to entry and their cloud capabilities.
- Export the object.
- It is easiest if you name the object before you export and have the same name and spelling across all devices. The naming convention is ET_.
- The easiest process I have found is to export from Shapr3D as an .stl file. In Omniverse, on the top ribbon go to File -> Import; .stl should be an available file type. In my brief use, .stl has seen more success than .obj.
- There is already an .stl of a Samsung charger block <include picture; if no picture, it is a standard Samsung charger found in the vis lab> that can be used. The demo scene should already have one implemented.
- Import the object into OVIS and size it as needed.
- There will probably be a need for some attribute editing (in the Property tab of said object) so that the Vicon systems and scripts can utilize the object. Follow these steps if you don't see a Rotate field in the Property tab (a scripted equivalent is sketched after these steps):
- If the object has an Orient op on it, right-click to delete the Orient TransformOp.
- Repeat for any other TransformOps besides Translate and Scale.
- At the top of the Property tab, click Add -> TransformOp -> Rotate.
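If you prefer to do this from the Script Editor, a rough equivalent using the USD API is below. The prim path is hypothetical; note that SetXformOpOrder only removes the ops from the xformOpOrder rather than deleting the underlying attributes, which is enough for the Rotate field to appear.

```python
from pxr import UsdGeom
import omni.usd

stage = omni.usd.get_context().get_stage()
prim = stage.GetPrimAtPath("/World/ET_charger")   # hypothetical test-object path
xform = UsdGeom.Xformable(prim)

# Keep only the Translate and Scale ops, dropping Orient and anything else.
keep = [op for op in xform.GetOrderedXformOps()
        if op.GetOpType() in (UsdGeom.XformOp.TypeTranslate, UsdGeom.XformOp.TypeScale)]
xform.SetXformOpOrder(keep)

# Add a Rotate op so the Property tab shows a Rotate field.
xform.AddRotateXYZOp()
```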
- Next, add reflective tracking markers to your object in whichever way you see fit; just make sure there is some non-uniformity so it is easier to discern when the object has been rotated.
- If you are using reflective stickers, it may be necessary to cut them and combine the cutouts to make a surface area that is easier for the Vicon system to track.
- If you choose to cut out stickers, try to create cuts and combine the stickers like the pictures below:
- In the Vicon tracking software, check that the system is able to pick up your test object and all of the stickers.
- If the software shows your object, create an object following the directions in the pictures below (on phone).
- Make sure the object name is the same as the object that you just created.
- The screenshots are for Vicon Tracker 3.6 from the User guide. Use the User Guide for whichever version of Vicon you are using.
- Make note of the IP address of the computer that is hosting the Vicon Tracker; the Vicon Tracker will broadcast object positions.
- When you play the scene, you should be able to move the real object and see the simulated object reflect those movements.
- Ensure Vicon test objects are created using the directions above.
- Mark a starting location, point A, and an ending location, point B.
- It is most likely necessary for point A and point B to be visible to the Vicon system.
- FOR OMNI ONLY ARCHITECTURE: Ensure the .usd file that you want to use is uploaded to the Nucleus. For ease of networking, make sure any scripts that use networking are on the local hard drive of your computer.
- Open Isaac Sim and USD Composer.
- In Omniverse, open the .usd on the Nucleus and start the live sync session (include picture).
- Test that the scene works.
- The arm should go to the target cube and the cube should be green.
- The long rectangle should have blue and purple options that open and close the gripper.
- The test objects and real-arm objects should move so long as your computer is on the same Wi-Fi network.
- If the physical arm is on the Wi-Fi, it should reflect the movement the simulated arm makes, though it may have some jittery movements.
- vr_ur.py should be stored locally on the computer, and the .usd on the Nucleus should use the locally stored script. This circumvents odd networking issues so long as zmq imports properly.
- In Isaac Sim, copy the join-session link: open the live sync session dropdown, join the session, and copy the session link.
- In USD Composer, select RTX - Interactive in the top left corner. a. Repeat the same thing in Isaac Sim. b. This may not be necessary at times when using an Oculus Quest 2 with updated Quest Link drivers and so on; I was unsure what the requirements for proper VR usage were, and things were finicky during testing.
- In USD Composer, in the top ribbon select Window -> Rendering -> VR to open the panel used to turn on VR.
- Wake up the headset, enable Quest Link, and connect to the host computer.
- Start SteamVR.
- Select Start VR in USD Composer.
- Place the test object within reach of the arm
- Start VR
- Start a screen recording either on the VR headset or the host computer.
- Start the digital twin (Isaac Sim and USD Composer).
- Set up a real-world video recorder and make note of its orientation. Analysis works best if general directions like “Left” in the simulation are the same as “Left” in the real world.
- Grab the object from marked location A.
- Move the object to marked location B.
- Release the object.
- Grab object from location B and move it to location A.
- Save the output files
- Pyzmq not compiled in backend. See this issue: (include image; note that this was finicky on curiosity; include the GitHub solution and note that their solution was not helpful).
- Check the lines in existing scripts that call sys.path.insert.
- Ensure that pyzmq is installed for the Python on that path; the path to Python should look like the path used in the insert.
- Try cycling through zmq installations (a quick sanity check is sketched after this troubleshooting item):
- pip show zmq
- pip show pyzmq
- pip install --upgrade pyzmq (note that it is two dashes)
- pip uninstall zmq
- pip install zmq
- Ensure that these are done before upgrading past pyzmq 26.0.3.
- Path too long? https://www.youtube.com/watch?v=obJmcid_erI
- Restart the computer
- Shut down the computer
- If all else fails: cry, report the issue to the pyzmq github, then go home and try tomorrow.
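As a quick sanity check while cycling through installations, the snippet below, run with the same Python interpreter that Isaac Sim and your scripts use (that interpreter choice is the important assumption), shows whether pyzmq imports, which backend it is bound against, and which installation is actually on the path.

```python
import zmq

print("pyzmq version:", zmq.pyzmq_version())   # keep at or below 26.0.3 per the note above
print("libzmq version:", zmq.zmq_version())    # fails or looks wrong if the backend is missing
print("loaded from:", zmq.__file__)            # confirms which installation is on sys.path
```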
- Moving the wrong things in VR:
- Add an attribute called xr:disable_grab (it has to be named exactly this, case sensitive) and make it a bool and varying (a scripted version is sketched after this item).
- Add this to the World and to the objects you want to move.
- Make sure it is checked on the parent xform.
- Make sure it is unchecked on the objects you want to interact with.
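For reference, a scripted version of the xr:disable_grab setup. The prim paths are hypothetical; only the attribute name, type (bool), variability (varying), and the checked/unchecked convention come from the notes above.

```python
from pxr import Sdf
import omni.usd

stage = omni.usd.get_context().get_stage()

def set_disable_grab(prim_path, disabled):
    prim = stage.GetPrimAtPath(prim_path)
    attr = prim.CreateAttribute("xr:disable_grab", Sdf.ValueTypeNames.Bool,
                                custom=True, variability=Sdf.VariabilityVarying)
    attr.Set(disabled)

set_disable_grab("/World", True)              # checked on the parent xform
set_disable_grab("/World/ET_charger", False)  # unchecked on objects you want to grab
```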
- Create the object at 0,0,0 in Vicon for smaller objects
- Save the file with the same name, flattened, to get rid of references.
- Blue arrow on xform means payload
- I means instanceable
- Orange arrow on xform means reference
- Both arrows suck, save as flatten
- Source: https://forums.developer.nvidia.com/t/list-of-icons/202180/6
- Act of god or otherwise unmanageable issue
- cry
- Restart
- VR not connecting
- Restart headset
- Connect to wifi
- Enable quest link
- Restart steamvr then connect to computer
- Start vr in USD composer
| EE Name | Functions | Imports | Different Modes |
|---|---|---|---|
| myCobot_gripper_ag | mc = MyArm('/dev/ttyAMA0', 115200) | from pymycobot.myarm import MyArm | Gripper type: 1. Adapt grip, 2. 5-finger hand, 3. // gripper, 4. Flex grip |
|  | mc.set_gripper_state(state, speed, gripper_type) | See above | State: Open - 0, Close - 1, Release - 254 |
|  | mc.get_gripper_value(1) | See above | Sets gripper type; see information in the second row |
|  | mc.set_gripper_value(arm_rotation, speed, gripper_type) | See above | Arm rotation range: [0, 100]; speed range: [0, 100] |
|  | mc.set_gripper_calibration() | See above | Description TBA |
|  | mc.is_gripper_moving() | See above | Description TBA |
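A short usage sketch tying the calls in the table above together. The serial port, speed, and gripper-type values are just examples; the function names and argument meanings follow the table, but exact signatures can differ between pymycobot versions, so treat this as a sketch rather than a definitive reference.

```python
from pymycobot.myarm import MyArm

mc = MyArm('/dev/ttyAMA0', 115200)            # serial port and baud rate from the table

GRIPPER_TYPE = 1                              # 1 = adapt grip (see table)

mc.set_gripper_calibration()                  # calibrate the gripper before first use
mc.set_gripper_state(0, 50, GRIPPER_TYPE)     # 0 = open, at speed 50
mc.set_gripper_value(100, 50, GRIPPER_TYPE)   # drive to a position in [0, 100]
print(mc.get_gripper_value(GRIPPER_TYPE))     # read back the current gripper value
print(mc.is_gripper_moving())                 # check whether the gripper is still moving
mc.set_gripper_state(1, 50, GRIPPER_TYPE)     # 1 = close
```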
Suction cup | Function | import | Modes |