Minitask 2
In this task, you will implement basic robot behaviours on your Turtlebot Waffle using finite state machines (FSMs). By leveraging laser scan data from the robot’s LIDAR, you will enable the TurtleBot to navigate its surroundings, avoid obstacles, and follow walls. This exercise will enhance your understanding of behavioural robotics and get you closer to completing your assignment.
Task 1: Setting up your environment
Follow the steps below to get ready for this week's activities:
- Create a package for `minitask2`.
- Create a file called `behaviours.py` and make it executable.
- Source your workspace and set your TurtleBot model to `waffle`. If you're unsure how to do this, refer back to Minitask-0. Please skip this step if you are using a lab machine or have already updated your `.bashrc` file.
- Start the Gazebo simulation with the TurtleBot World environment using the command: `roslaunch turtlebot3_gazebo turtlebot3_world.launch`
- You may want to use RViz to observe what your robot "sees": `roslaunch turtlebot3_gazebo turtlebot3_gazebo_rviz.launch`
- This RViz environment is preconfigured to display all relevant topics, but feel free to customise it according to your preferences. See Resource 2 below to learn how to configure RViz.
- In RViz, go to the Laser Scan display and increase the size to `0.05 m` to make the scan clearly visible.
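If you need a reminder of how to create the package, a typical command (assuming a catkin workspace; the dependency list here is an illustrative guess, not a requirement) is `catkin_create_pkg minitask2 rospy geometry_msgs sensor_msgs`, run from your workspace's `src` directory.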
Task 2: Implementing Behaviours
Your robot is asked to perform three behaviours:
- Obstacle avoidance: If the TurtleBot is faced with an obstacle closer than `0.3 m`, it should stop and rotate until no obstacle is within that distance.
- Right-hand wall following: If the TurtleBot observes a wall on its right-hand side, it should follow the wall.
- Random walk: In all other cases, the TurtleBot moves forward 1 metre, randomly chooses a direction, and rotates to continue the random walk. This behaviour loops forever: at each iteration the TurtleBot moves 1 metre forward and rotates towards a random direction.
Think about how these behaviours should override each other. Which is the most important behaviour? Why?
Notice that the behaviours should inhibit each other. An easy way to implement this is with a finite state machine (FSM), as discussed in the lecture. In your FSM, each behaviour should be implemented separately and act as a single state.
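One possible way to structure this in Python is sketched below. This is only an illustrative starting point, not the required design: the state names, the `0.5 m` wall-detection range, and the `front_dist`/`right_dist` attributes (which you would update from `/scan`) are all assumptions.

```python
from enum import Enum

class State(Enum):
    OBSTACLE_AVOIDANCE = 1
    WALL_FOLLOWING = 2
    RANDOM_WALK = 3

class BehaviourFSM:
    def __init__(self, obstacle_threshold=0.3):
        self.state = State.RANDOM_WALK
        self.obstacle_threshold = obstacle_threshold
        self.front_dist = float('inf')   # would be updated from /scan
        self.right_dist = float('inf')   # would be updated from /scan

    def update_state(self):
        # Check transitions in priority order: avoiding a collision
        # always overrides wall following, which overrides random walk.
        if self.front_dist < self.obstacle_threshold:
            self.state = State.OBSTACLE_AVOIDANCE
        elif self.right_dist < 0.5:      # assumed wall-detection range
            self.state = State.WALL_FOLLOWING
        else:
            self.state = State.RANDOM_WALK
```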
To implement these behaviours, your robot should use sensor data. In this minitask we will use the laser scan data published on the `/scan` topic. You should see this data in RViz under the Laser Scan display (don't forget to increase the size to `0.05 m` to make the scan clearly visible).
Work through the following parts to build up everything you need for the robot to perform these behaviours. When you've finished Part 3, come back here and look at the behaviours again. How are you going to implement them?
Part 1: Understanding laser scan data
The TurtleBot is equipped with a LIDAR sensor that provides 360-degree range information. This data is published on the `/scan` topic. You can view it in RViz, and to view a single scan message, use the following command in your terminal:
rostopic echo /scan -n1
- What do you think the flag `-n1` does?
- Can you find out more information about the topic by running `rostopic info /scan`?
- What is the message type for `/scan`? Refer to Resource 1 to learn more about this message type.
- After verifying that you'll be working with a LaserScan message type, run `rosmsg show sensor_msgs/LaserScan` to see its structure (reproduced below for reference).
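On a standard ROS installation, that command prints the following message definition:

```
std_msgs/Header header
  uint32 seq
  time stamp
  string frame_id
float32 angle_min
float32 angle_max
float32 angle_increment
float32 time_increment
float32 scan_time
float32 range_min
float32 range_max
float32[] ranges
float32[] intensities
```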
Your laser scanner captures a 360-degree scan around the robot as a set of range readings, one per scan line. This information can be used to detect obstacles around the robot.
Part 2: Subscribing to the `/scan` topic and working with LaserScan messages
Inside your `behaviours.py` file, which you created in Task 1, perform the following tasks:
- Create a Publisher that publishes to the `/cmd_vel` topic. This is needed as you will need to publish `Twist` messages appropriately for your implemented behaviours.
- Create a Subscriber that subscribes to the `/scan` topic.
- Implement a callback function to process the laser scan data. You will be interested in the `ranges` array, which contains the distance to obstacles for each laser scan line.
- What should be the type of the `msg` parameter in your Subscriber?
- A value of `Inf` within the `ranges` array normally indicates that there is no obstacle within `range_max` (if you haven't already done this, see the member variables of a LaserScan message). Before deploying your code onto the robot, verify that this is the case. You may be surprised to see that boundary values differ between simulation and the real robot :)
In the specific case of our TurtleBot, we have 360 `float32` range values, which are stored in the `ranges` array. The index of each element indicates the direction of the corresponding scan line and is always aligned with the base frame of the robot. For example, LaserScan data in the front, right, and left directions of the robot will be stored respectively in:
- `ranges[0]` (front)
- `ranges[270]` (right)
- `ranges[90]` (left)
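Putting Part 2 together, a minimal skeleton for `behaviours.py` might look like the sketch below. The `/cmd_vel` and `/scan` topics, the message types, and the array indices come from this minitask; the node name, the 10 Hz loop rate, and the attribute names are illustrative assumptions.

```python
#!/usr/bin/env python
import rospy
from geometry_msgs.msg import Twist
from sensor_msgs.msg import LaserScan

class Behaviours:
    def __init__(self):
        rospy.init_node('behaviours')   # assumed node name
        self.pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
        self.sub = rospy.Subscriber('/scan', LaserScan, self.scan_callback)
        self.front = float('inf')       # latest distances, updated by the callback
        self.right = float('inf')
        self.left = float('inf')

    def scan_callback(self, msg):
        # msg is a sensor_msgs/LaserScan; ranges is indexed by angle
        self.front = msg.ranges[0]
        self.right = msg.ranges[270]
        self.left = msg.ranges[90]

    def run(self):
        rate = rospy.Rate(10)           # assumed 10 Hz control loop
        while not rospy.is_shutdown():
            cmd = Twist()
            # TODO: set cmd.linear.x / cmd.angular.z based on your FSM state
            self.pub.publish(cmd)
            rate.sleep()

if __name__ == '__main__':
    Behaviours().run()
```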
Part 3: Not crashing into things
Arguably, not crashing into things is the most basic behaviour a robot should have. Edit your code so that your robot drives forwards until it is close to an object, at which point it stops moving. Hints:
- You will need to decide how 'close' is too close; the values are in metres
- Using a single scan point might miss the object... How about using a range of values from the scan data instead? You could, for instance, take a small window of indices within `ranges` and use the minimum or average/median distance to obstacles within that window (see the sketch after these hints).
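As a sketch of one possible approach (the ±15-degree window is an illustrative choice, not a requirement), you could compute the minimum finite distance in a window centred on the front of the robot:

```python
import math

def front_distance(msg, half_width=15):
    """Minimum finite distance in a window of +/- half_width degrees
    around the front of the robot (index 0 of a 360-element scan)."""
    window = [msg.ranges[i % 360] for i in range(-half_width, half_width + 1)]
    finite = [r for r in window if math.isfinite(r)]
    return min(finite) if finite else msg.range_max
```

Your callback could then stop the robot whenever this value drops below your chosen threshold.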
Now, go back to the top of Task 2 and look at the behaviours you've been asked to implement. How can you use what we've covered here to do that? What will the finite state machine look like? And finally, can you implement all the behaviours so they work together?
Resources
Resource 1. Laser scan message
Resource 2. RViz user's guide