Future Projects - jhu-dvrk/sawIntuitiveResearchKit GitHub Wiki
For all projects that impact the dVRK software stack (i.e. sawIntuitiveResearchKit or dvrk-ros), proper tickets/issues should be created on GitHub.
Control
MTMs grippers
The current calibration process uses the fully open and lightly closed angles. We could also use the fully closed angle, and perhaps a non-linear mapping between the master grippers and the tool's jaws. To use both Hall effect sensors, we need one of the digital outputs to toggle the mux; see #23. Since 1.7, the mapping is a bit different: the teleoperation code now retrieves the maximum angles for the PSM jaws and MTM grippers, then computes the scale from 0 to max between PSM and MTM. This allows the jaws to open fully and close as needed. Negative angles (applying more torque) reuse the positive scale. There is still room for improvement here. [Improvement, core C++]
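The post-1.7 scaling described above can be sketched as follows; the function name and limits are illustrative, not the actual dVRK code:

```python
def gripper_to_jaw(gripper_angle, gripper_max, jaw_max):
    """Map an MTM gripper angle to a PSM jaw angle (radians).

    Positive angles are scaled so the fully open gripper (gripper_max)
    maps to the fully open jaw (jaw_max); negative angles (squeezing
    harder to apply more torque) reuse the same positive scale.
    """
    scale = jaw_max / gripper_max
    return gripper_angle * scale
```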
Encoder/potentiometer redundancy
Includes better filtering, possibly with some delay to account for the fact that the potentiometers are slow. Filtering on the PC side tends to introduce more delay, so filtering might instead be added on the FPGA. The current (simplistic) implementation is based on a count of successive failures. Release 1.6 introduced a new test based on distance and latency, with parameters set per joint and per arm in the XML configuration file. This is not ideal but should be sufficient for most users. We also need to figure out how to use the potentiometer on the master roll (last active joint). See #25. [New feature, core C++]
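One way to make the check tolerant of potentiometer latency, sketched here under the assumption that comparing the pot against a short window of recent encoder samples is acceptable (names are hypothetical):

```python
from collections import deque

def pot_encoder_consistent(encoder_window, pot, tolerance):
    """Return True if the potentiometer reading agrees with at least one
    recent encoder sample.  Because the pots are slow, the newest encoder
    value may legitimately lead the pot; comparing against a short history
    absorbs that latency instead of flagging a false failure."""
    return any(abs(enc - pot) <= tolerance for enc in encoder_window)

# A deque with maxlen provides a fixed-size rolling window of encoder samples.
window = deque(maxlen=5)
```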
Velocity control
Add a special mode to the PID to servo torque based on the current velocity, or any better approach. In the arm class, add code to convert Cartesian velocities (likely in body frame) to joint velocities and use the PID interface to control joint velocities. Add ROS topics for this new feature. [New feature, core C++]
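At its simplest, the Cartesian-to-joint conversion mentioned above is a (pseudo)inverse of the body Jacobian; a minimal numpy sketch, ignoring joint limits and singularity handling:

```python
import numpy as np

def cartesian_to_joint_velocity(jacobian, twist):
    """Least-squares solution of J * q_dot = x_dot, i.e. q_dot = pinv(J) @ x_dot.
    'twist' is the desired Cartesian velocity (likely expressed in body frame)."""
    return np.linalg.pinv(jacobian) @ twist
```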
Better use of MTM redundancies
Positioning the wrist in effort mode, see #2. Taking advantage of the symmetry of the master gripper to maximize the joint-space overlap with the PSM, see #56. Since 1.7, code optimizes the 4th MTM joint (platform) when in free-moving mode; this is still work in progress for the IK. [New feature, core C++]
Better PSM teleoperation
Take the PSM joint limits into account and provide force feedback on the MTMs. Evaluate a different control scheme: we currently use absolute positions anchored at the start of follow mode. Using relative motions since the last command (incremental positions and/or velocities) might reduce the likelihood of jumps in Cartesian space. [Improvement, core C++]
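The incremental alternative suggested above could look like this sketch (hypothetical names; real code would live in the C++ teleoperation component):

```python
import numpy as np

def incremental_command(psm_position, mtm_current, mtm_previous, scale):
    """Command the PSM by the scaled MTM displacement since the last cycle,
    rather than an absolute pose anchored at the start of follow mode.
    Re-engaging after a clutch simply resets mtm_previous, so there is no
    stored offset that could cause a Cartesian jump."""
    delta = np.asarray(mtm_current) - np.asarray(mtm_previous)
    return np.asarray(psm_position) + scale * delta
```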
Trajectory generation
We should add support for trajectories in Cartesian space, perhaps interpolating position and Euler angles. [Improvement, core C++]
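A minimal sketch of what position/Euler-angle interpolation could look like (linear in both; slerp on quaternions would behave better near orientation representation singularities):

```python
import numpy as np

def interpolate_pose(p0, p1, e0, e1, t):
    """Interpolate position and Euler angles at normalized time t in [0, 1]."""
    p = (1.0 - t) * np.asarray(p0) + t * np.asarray(p1)
    e = (1.0 - t) * np.asarray(e0) + t * np.asarray(e1)
    return p, e
```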
Support for ROS MoveIt!
We could provide an interface accepting ROS JointTrajectory messages (http://docs.ros.org/jade/api/trajectory_msgs/html/msg/JointTrajectory.html); we already publish joint_state, so that should be enough for joint space. It is less clear what would be needed for Cartesian trajectories, but a better integration with TF is likely required. Joint-space trajectories shouldn't be too hard to code on the C++ side: take a list of points (PT and/or PVT) and use the existing robQuintic code to implement a trajectory-following mode. We would need to spec a cisstParameterTypes message and add the conversion method from ROS to cisst. An alternative solution would be a separate ROS node relying on the ROS controller "FollowJointTrajectory"; communication with the dVRK could then be done by re-implementing a hardware (HW) abstraction using ROS topics.
[New feature, core C++, ROS]
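For the PT case, the quintic segments boil down to the standard minimum-jerk polynomial; a sketch with zero boundary velocity and acceleration (PVT segments would add the boundary-velocity terms, and robQuintic handles the general case):

```python
def quintic(t, duration, q0, q1):
    """Quintic (minimum-jerk) interpolation from q0 to q1 over 'duration',
    with zero velocity and acceleration at both endpoints."""
    s = t / duration
    return q0 + (q1 - q0) * (10.0 * s**3 - 15.0 * s**4 + 6.0 * s**5)
```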
Video
RViz console
Create a virtual console with:
- Live or simulated video
- Maybe pro-view equivalent
- Icons for dVRK status: tool needs re-alignment, clutch pressed, ...
- Custom 3D widgets, maybe text viewers, drop tags, ... [New feature, ROS/RViz, maybe C++ RViz module]
AR RViz console
Create an augmented reality (AR) console based on RViz, building on an REU 2022 project (GitHub link). The first phase of the project is to implement and evaluate the following two measurement tools:
- PSM-based measurement: Move one or more PSMs and use robot kinematics to indicate distances:
  - If one PSM, press a button or footpedal to place a reference marker and then display the distance from that marker
  - If two PSMs, display the distance between them
- MTM-based measurement: Press the clutch pedal to enter a "Masters as Mice" (MaM) mode, where the MTMs move cursors on the display:
  - If one MTM, press a button or footpedal to place a reference marker and then display the distance from the cursor to that marker
  - If two MTMs, display the distance between the two cursors
The advantage of the PSM-based measurement is that it does not require camera (endoscope) calibration, but the disadvantages are that the instruments must be moved inside the patient and that it relies on robot kinematics, which have some error. The advantage of the MTM-based measurement is that it does not move anything inside the patient, but it has the disadvantage that it requires calibration of the endoscope camera.
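Both tools reduce to the same core computation: the distance between two 3D points expressed in a common frame (tip positions from robot kinematics for the PSM tool, cursor positions for the MTM tool). A trivial sketch:

```python
import numpy as np

def measured_distance(point_a, point_b):
    """Euclidean distance between two 3D points (e.g. two PSM tips, or a
    cursor and a previously placed reference marker), both expressed in
    the same frame, e.g. the camera frame."""
    return float(np.linalg.norm(np.asarray(point_a) - np.asarray(point_b)))
```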
Once the AR console is implemented, it can be evaluated in a user study, where subjects are asked to measure lengths of 3D objects. As a baseline, the objects can be measured using a ruler; this may be difficult for many 3D objects.
Motorized 3D camera
For dVRK users (i.e. users who don't have a full da Vinci), an alternative arm with a stereo head. Maybe a simple pan/tilt camera with a ROS interface, controllable from the MTMs? In any case, we should include a mechanism to report the position of the PSMs with respect to the camera. For a pan/tilt unit, the orientation is really what we are after.
Applications
Graphical Launcher
Currently, it is necessary to specify a rather lengthy command line to start the dVRK. One alternative would be a graphical launcher. An initial implementation could:
- Select the bus (Firewire or Ethernet)
- Scan the bus to identify which arms are connected (e.g., MTMs, PSMs)
- Display a GUI with radio buttons to select which arms to use
- Perhaps include options to select footpedals, etc.

In the next stage, the launcher could be configured to include simulated robot arms (e.g., PSMs in AMBF). In that case, there could also be options to select a simulated scene, such as a simulated peg transfer or the surgical robotics challenge scene.
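The scan-and-select flow above could be sketched as follows; the dictionary layout and function name are hypothetical, not an existing dVRK API:

```python
def select_arms(detected_arms, selected_names):
    """Given arms reported by a (hypothetical) bus scan and the names the
    user ticked in the GUI, return the subset to include in the console
    configuration."""
    return [arm for arm in detected_arms if arm["name"] in selected_names]
```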
Potentiometer and DH calibration
We have a procedure to calibrate the scales, but we're missing many offsets; the only ones handled for now are the last four on the PSMs. Can we find a way to calibrate the pots on the MTMs, the ECM, and the first three PSM joints? Develop a procedure to collect 3D positions, both from a tracking system (likely optical) and from the encoders, and then identify the ideal DH parameters. JHU had a high school student working on a similar project during summer '16. [New feature, ROS, ...]
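For the scale/offset part, a least-squares fit over paired samples collected while sweeping a joint through its range is the natural starting point; a sketch (the full DH identification from tracker data is a much larger problem):

```python
import numpy as np

def fit_pot_scale_offset(encoder, pot):
    """Fit encoder ~ scale * pot + offset in the least-squares sense from
    paired samples gathered while moving the joint through its range."""
    A = np.column_stack([pot, np.ones(len(pot))])
    (scale, offset), *_ = np.linalg.lstsq(A, encoder, rcond=None)
    return float(scale), float(offset)
```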
Dynamic simulation
Two different goals:
- Offline simulation
- Real-time simulation, perhaps a research skills simulator?
Gazebo
Reviews of Gazebo are mixed: some report issues with stability and poor documentation, but it is widely used in the ROS community, including for the DARPA project and at JHU with WAM arms. There seems to be a fair amount of work done at WPI and some at JHU; a JHU summer '16 undergraduate project is ongoing. [New feature, Gazebo/RViz/ROS]
VRep
Some JHU users have experience with VRep. They've been happy with the user support and the stable API. Not as popular as Gazebo. [New feature, VRep/ROS]
Documentation
Test protocol
- Must compile code on a clean install of the two latest Ubuntu LTS releases (e.g. 16.04 and 14.04)
- Test compilation with and without catkin/ROS
- Include examples with generic/derived arms in compilation
- Run Python/Matlab test scripts to make sure ROS topics have been updated on all ends
- Add test scripts checking basic transitions between joint/cartesian modes; add small random noise to the PID measured positions in simulated mode to make sure we can differentiate measured vs. commanded positions
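The measured-vs-commanded check in the last item could be driven by a helper like this (hypothetical; in the real stack the noise would be injected in the simulated PID component):

```python
import random

def add_measurement_noise(commanded, sigma=1e-4):
    """Perturb commanded joint positions with small Gaussian noise so the
    simulated 'measured' values are distinguishable from the 'commanded'
    ones in test scripts."""
    return [q + random.gauss(0.0, sigma) for q in commanded]
```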