vrx_2019 task_tutorials - osrf/vrx GitHub Wiki
For each of the individual tasks in the 2019 VRX competition, we provide example simulation worlds and Gazebo plugins to evaluate and score task performance. Instructions for running these examples are given below. Please see the Description of Tasks and Technical Guide documents for a description of each task, its application programming interface (API), and its scoring.
The individual tasks are described in the competition documents posted on the Documentation Wiki.
After launch, all of the examples below should begin with the WAM-V floating on the water near the shore, as shown:
Additional course elements will vary from task to task.
The `vrx_gazebo/Task` status message includes:
- task state {Initial, Ready, Running, Finished}
- current score
- timing information
Task status is published to `/vrx/task/info` (further details of the API are in the Technical Guide). We recommend monitoring the task status during simulation. One way to do this is to run:
rostopic echo /vrx/task/info
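For example, a solution node will typically hold off on sending thrust commands until the task leaves the Initial state. As a minimal sketch (plain Python, with the message modeled as a dict rather than an actual `vrx_gazebo/Task` instance; the helper name is our own, not part of the VRX API):

```python
# Sketch: gating behavior on the VRX task state.
# The real message is vrx_gazebo/Task; here we model only the state
# field, as a plain dict (hypothetical helper, not VRX code).

VALID_STATES = ("initial", "ready", "running", "finished")

def should_send_commands(task_info):
    """Return True when it is useful for the vehicle to move.

    During 'initial' the simulation is still settling, and after
    'finished' the run is over; motion matters in between.
    """
    state = task_info["state"]
    if state not in VALID_STATES:
        raise ValueError("unknown task state: %r" % state)
    return state in ("ready", "running")
```

A real node would call a check like this from its `/vrx/task/info` subscriber callback.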
In preparation for developing an automated solution, we recommend using a gamepad or keyboard to drive the USV through the course (see Driving tutorial):
- Gamepad:
roslaunch vrx_gazebo usv_joydrive.launch
- Keyboard:
roslaunch vrx_gazebo usv_keydrive.launch
The following quick start instructions walk you through the initial process of launching your environment and subscribing to any available task-specific messages.
## Task 1: Station-Keeping

Summary: Navigate to the goal pose and hold station. The best solutions will minimize the difference between the goal pose and the actual pose of the vehicle over the duration of the task.
- Start the example:
roslaunch vrx_2019 station_keeping.launch
- Subscribe to the task-specific topics provided by the stationkeeping scoring plugin:
- The station-keeping goal (given as a geographic_msgs/GeoPoseStamped message):
rostopic echo /vrx/station_keeping/goal
- The current position error values:
rostopic echo /vrx/station_keeping/pose_error
rostopic echo /vrx/station_keeping/mean_pose_error
- For implementation details, see "4.1.1. Task 1: Station-Keeping" in the Competition and Task Descriptions, or refer to the stationkeeping scoring plugin. (Note that the scoring function and API have been modified slightly since the original VRX 2019 competition to reflect subsequent improvements.)
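To get a feel for what the `pose_error` topic reports, the sketch below combines great-circle distance to the goal with a weighted heading error. This is an illustration of the idea, not the plugin's actual code; the weight `k` and the function names are our own assumptions:

```python
import math

# Illustrative only (not the scoring plugin's code): a combined
# position + heading error for station keeping.

EARTH_RADIUS_M = 6371000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points (degrees)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def pose_error(goal, actual, k=0.75):
    """Position error plus weighted absolute heading error (yaw in radians).

    The weight k is a hypothetical value chosen for illustration.
    """
    d = haversine_m(goal["lat"], goal["lon"], actual["lat"], actual["lon"])
    dh = abs(math.atan2(math.sin(goal["yaw"] - actual["yaw"]),
                        math.cos(goal["yaw"] - actual["yaw"])))
    return d + k * dh
```

The mean of this quantity over the run corresponds to the `mean_pose_error` topic.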
## Task 2: Wayfinding

Summary: Navigate through each of the published waypoints so that the vehicle achieves, as closely as possible, the specified positions and orientations.
- Start the example:
roslaunch vrx_2019 wayfinding.launch
- Subscribe to the task-specific topics provided by the wayfinding scoring plugin:
- The list of waypoints (given as a geographic_msgs/GeoPath message):
rostopic echo /vrx/wayfinding/waypoints
- The current minimum errors achieved for each waypoint so far:
rostopic echo /vrx/wayfinding/min_errors
- The current mean of the minimum errors:
rostopic echo /vrx/wayfinding/mean_error
- For implementation details, see "4.1.2. Task 2: Wayfinding" in the Competition and Task Descriptions, or refer to the wayfinding scoring plugin. (Note that the scoring function and API have been modified slightly since the original VRX 2019 competition to reflect subsequent improvements.)
- Review the Waypoint visualization tutorial to configure waypoint visualization markers inside Gazebo for station keeping or wayfinding tasks.
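Conceptually, the `min_errors` and `mean_error` values amount to keeping, for each waypoint, the smallest pose error seen so far and reporting the mean of those minima. A minimal sketch (our own illustration, not the plugin code):

```python
# Illustrative sketch of the wayfinding bookkeeping: per-waypoint
# running minimum pose errors and their mean.

class MinErrorTracker:
    def __init__(self, num_waypoints):
        # Until a waypoint has been sampled, its minimum is infinite.
        self.min_errors = [float("inf")] * num_waypoints

    def update(self, waypoint_index, pose_error):
        """Record a new pose-error sample for one waypoint."""
        if pose_error < self.min_errors[waypoint_index]:
            self.min_errors[waypoint_index] = pose_error

    def mean_error(self):
        """Mean of the per-waypoint minimum errors."""
        return sum(self.min_errors) / len(self.min_errors)
```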
## Task 3: Landmark Localization and Characterization

Summary: In this task, the vehicle remains in a fixed location while markers appear in its field of view (see this video for a demonstration of the expected behavior). The objective is to use the vehicle's perception sensors to identify the markers and report their locations.
- Start the example:
roslaunch vrx_2019 perception_task.launch
- View the camera feeds from the front of the WAM-V:
rosrun rqt_gui rqt_gui --perspective-file ~/vrx_ws/src/vrx/vrx_gazebo/config/front_stereo.perspective
- Trials will begin. Identify the type and location of the markers that appear during each trial.
- Publish landmark identification and localization solutions as a geographic_msgs/GeoPoseStamped message to the `/vrx/perception/landmark` topic:
rostopic pub -1 /vrx/perception/landmark geographic_msgs/GeoPoseStamped '{header: {stamp: now, frame_id: "red_mark"}, pose: {position: {latitude: 21.30996, longitude: -157.8901, altitude: 0.0}}}'
- Submission criteria:
- Each trial will last for 5 seconds.
- Solutions must be submitted before the end of the trial.
- Only the first submission for each trial will be considered.
- For further details, including a table of 3D objects that may appear during trials, see "4.2.3. Task 3: Landmark Localization and Characterization" in the Competition and Task Descriptions, or refer to the perception scoring plugin.
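As the `rostopic pub` example above shows, the object label travels in `header.frame_id` and the estimated position in `pose.position`. A small hypothetical helper that assembles those fields (as a plain dict mirroring the message layout, not an actual ROS message object) might look like:

```python
# Hypothetical helper: build the payload for a perception-task report,
# mirroring the GeoPoseStamped fields used in the rostopic pub example.

def landmark_report(frame_id, latitude, longitude):
    """Return a dict with the fields the scorer reads: the object label
    in header.frame_id, the estimated position in pose.position."""
    return {
        "header": {"stamp": "now", "frame_id": frame_id},
        "pose": {"position": {"latitude": latitude,
                              "longitude": longitude,
                              "altitude": 0.0}},
    }
```

In a real node the same fields would be set on a `geographic_msgs/GeoPoseStamped` instance and published to `/vrx/perception/landmark`.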
## Task 4: Traverse Navigation Channel

Summary: Traverse a navigation channel specified by red and green markers, avoiding obstacles.
- Start the example:
roslaunch vrx_2019 navigation_task.launch verbose:=true
- There are no ROS topics specific to this task. However, relevant Gazebo messages such as "New gate crossed!" will be printed to the terminal.
- For further details, see "4.3.1. Task 4: Traverse Navigation Channel" in the Competition and Task Descriptions, or refer to the navigation scoring plugin.
## Task 5: Scan-the-Code and Dock

Summary: Given multiple docking bays, choose the correct one, dock safely, then exit the dock.
There are two variants of this challenge.
- In the first variant the correct dock is specified via a ROS message.
- In the second variant, the correct dock must be deduced from the Scan-the-Code sequence.
Below are succinct steps to run the examples. For more details on how successful docking is determined and scored, along with debugging suggestions, see Docking Details.
### Variant 1: Dock

- Start the example:
roslaunch vrx_2019 dock.launch verbose:=true
- Subscribe to the ROS topic that specifies the color and shape of the placard on the target docking bay:
rostopic echo /vrx/scan_dock/placard_symbol
- Dock in the bay displaying the symbol published on the `placard_symbol` topic.
If you would like to change the color and shape of the symbols, there are a couple of options.
- If you have installed the VRX source, you can change the designated "correct" docking bay color and shape:
  1. Edit the `vrx_gazebo/worlds/dock.world.xacro` file.
  2. Look for the `<bay>` tags and find the section with `<dock_allowed>true</dock_allowed>`.
  3. Change the strings in both the `<announce_symbol>` and `<symbol>` tags to your desired color and shape, e.g., `green_triangle`.
  4. Rerun `catkin_make` to process the xacro file.
  5. Restart the example. You should now see the target bay marked with the desired placard, and the same `<COLOR>_<SHAPE>` specification published on the `/vrx/scan_dock/placard_symbol` topic.
- Alternatively, if you would just like to change the color and shape without affecting the scoring plugin (which is how the system knows which is the "correct" bay), you can publish a ROS message to change the color and shape to a new random selection:
rostopic pub /vrx/dock_2018_placard1/shuffle std_msgs/Empty "{}"
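Symbols on the `placard_symbol` topic are strings of the form `<COLOR>_<SHAPE>` (e.g., `green_triangle`). A small hypothetical parser for such strings; the sets of allowed colors and shapes below are our assumptions for illustration, not an authoritative list:

```python
# Hypothetical parser for "<COLOR>_<SHAPE>" placard symbol strings.
# The value sets are assumptions chosen for this sketch.

COLORS = {"red", "green", "blue", "yellow"}
SHAPES = {"circle", "cross", "triangle"}

def parse_placard_symbol(symbol):
    """Split a placard symbol into (color, shape), validating both parts."""
    color, _, shape = symbol.partition("_")
    if color not in COLORS or shape not in SHAPES:
        raise ValueError("unrecognized placard symbol: %r" % symbol)
    return color, shape
```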
### Variant 2: Scan and Dock

- Start the example:
roslaunch vrx_2019 scan_and_dock.launch verbose:=true
- View the camera feeds from the front of the WAM-V:
rosrun rqt_gui rqt_gui --perspective-file ~/vrx_ws/src/vrx/vrx_gazebo/config/front_stereo.perspective
- Approach the scan-the-code buoy and identify the sequence of three colors displayed.
- Transmit the correct sequence to the color sequence server:
rosservice call /vrx/scan_dock/color_sequence "blue" "red" "green"
- Notes:
  - The service name is `/vrx/scan_dock/color_sequence`.
  - Allowable values are "red", "green", "blue", and "yellow".
  - The service can only be called once; the first call returns true even if the sequence is incorrect. If the sequence is correct, the score is incremented by the colorBonusPoints value (typically 10 points). You can monitor the score with:
rostopic echo /vrx/task/info
- Dock in the bay displaying the symbol that corresponds to the correct color sequence.
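Because only the first service call counts, it is worth validating a candidate sequence locally before submitting it. A minimal hypothetical check (the function name is our own):

```python
# Hypothetical pre-flight check before calling the color-sequence
# service: exactly three colors, each one of the allowed values.

ALLOWED_COLORS = {"red", "green", "blue", "yellow"}

def validate_sequence(colors):
    """Return True if this is a well-formed scan-the-code report."""
    return len(colors) == 3 and all(c in ALLOWED_COLORS for c in colors)
```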
For further details on either variant, see "4.3.2. Task 5: Scan-the-code and Dock" in the Competition and Task Descriptions, or refer to the scan and dock scoring plugin.