Minitask 2

Robot Behaviours

In this lab, you will program 3 behaviours and a state machine that will manage those behaviours. Your robot will be able to follow a wall, avoid obstacles, and perform a random walk to find a wall to follow.

The deadline for this minitask is 23:59 03/11/2025. You will need to demo your code on a robot to a lab TA to get full marks.

Before you begin

Pull the code skeleton from our GitHub and move it into the right place in your Docker setup. Have a look at the package structure and you might notice something...

❗ The file minitask2.py is actually called minitsak2.py! This is a deliberate mistake to help you explore the package structure. If you just rename the file and rebuild, or try to run the code straight after renaming it, you'll run into errors.

❓ Explore the package and find what other files make reference to minitsak2.py, and change them.

🤖 Let's talk code structure:

  1. When writing ROS2 code, keep your sensor callbacks limited to processing the incoming sensor data. The logic that decides what to do with that data should live elsewhere, e.g. in some kind of control method. For this code, you can start by using the timer callback as the controller (see the sketch after this list).
  2. Each behaviour should be its own ROS2 node, in its own package. Consider this a stretch exercise for this task, and we'll cover this in more detail in the next minitask.
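For reference, here is a minimal structural sketch of that split. It is not part of the skeleton: the node name, MiniTask2Node, and the attribute latest_scan are illustrative choices, and the speeds are placeholders.

import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist
from sensor_msgs.msg import LaserScan

class MiniTask2Node(Node):
    def __init__(self):
        super().__init__('minitask2')
        self.latest_scan = None
        self.create_subscription(LaserScan, 'scan', self.scan_callback, 10)
        self.cmd_pub = self.create_publisher(Twist, 'cmd_vel', 10)
        self.create_timer(0.1, self.control_loop)  # 10 Hz controller

    def scan_callback(self, msg):
        # Only store the incoming data -- no decision-making here.
        self.latest_scan = msg

    def control_loop(self):
        if self.latest_scan is None:
            return
        # Behaviour logic goes here; for now, just creep forwards.
        cmd = Twist()
        cmd.linear.x = 0.1
        self.cmd_pub.publish(cmd)

def main():
    rclpy.init()
    node = MiniTask2Node()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()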

🤖 You may want to use the TurtleBot House map for this task, as it has more straight walls for your robot to follow. Launch it using ros2 launch turtlebot3_gazebo turtlebot3_house.launch.py

Goals

By the end of this minitask, you should be able to:

  • Have your robot avoid obstacles using the LIDAR
  • Have your robot follow a wall using the LIDAR
  • Implement 3 robot behaviours
  • Build a state machine so your robot can use its behaviours

Behaviour 1: obstacle avoidance with LIDAR

Task 1: using the skeleton to understand LIDAR messages

Firstly, start the TurtleBot simulation using ros2 launch turtlebot3_gazebo turtlebot3_world.launch.py, then open up RQT using rqt.

In RQT, use the message type browser (see Minitask1 if you need a refresher) to find the message type LaserScan (take a guess at which message package it's in!)

There are three key parts of this message we're interested in:

  1. range_min
  2. range_max
  3. ranges

❓ What type is each of these variables?

❓ What is each variable representing?

🤖 Now, open up the skeleton code minitask2.py and find the callback method that processes the LIDAR data. There are some "unknown variables", direction_1 to direction_4, whose meaning you'll need to work out by experimenting with the robot in different locations; rename them to something more descriptive.

(Hint: you should have found that ranges contains the returned distances for all LIDAR scans. The LIDAR scans a full 360 degrees, so ranges runs from [0] to [359].)

IMPORTANT NOTE: Due to differences in hardware and settings, on some machines you may get fewer than 360 scan values (check the length of your ranges array). If you have fewer than 360 values, look at angle_increment, the angle in radians between two consecutive scan lines: angle_increment x len(ranges) should come out close to 2π. In that case, work out the left/right/front/back directions using arithmetic as necessary.
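A minimal sketch of that arithmetic, assuming the scan starts at the front of the robot and increases anticlockwise (verify this on your setup); index_for_bearing is a hypothetical helper, not part of the skeleton:

import math

def index_for_bearing(scan, bearing_deg):
    # Map a bearing in degrees (0 = front, 90 = left, assuming an anticlockwise
    # scan starting at the front) to an index into scan.ranges, whatever its length.
    bearing_rad = math.radians(bearing_deg % 360)
    return int(round(bearing_rad / scan.angle_increment)) % len(scan.ranges)

# Inside your scan callback you could then read, for example:
#   front = msg.ranges[index_for_bearing(msg, 0)]
#   left  = msg.ranges[index_for_bearing(msg, 90)]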

❓ What LIDAR readings do you get when something is too far away for the robot to detect?

Task 2: driving forwards and not hitting things

Now that you know what the LIDAR data looks like, you can get your robot to drive forwards until it gets too close to something. The LIDAR data comes in increments of 1 degree, so a single reading won't be enough to establish whether you're too close to something (that one scan line might miss the object!).

❓ Decide on a range of scan points to consider when deciding if you're too close to something

❓ Decide on a sensible way to aggregate the data over multiple LIDAR points into a single reading

❓ How close is "too close"? The LIDAR returns readings in metres, so use that to make a good first guess, then experiment!

🤖 You should issue a Twist message to drive forwards, until the robot is too close to something. Then, publish a Twist with everything set to zero to stop the robot.
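Here is a sketch of one way to aggregate a window of readings and act on it. The ±15 degree window, 0.5 m threshold, and 0.15 m/s speed are guesses for you to tune, and self.latest_scan / self.cmd_pub are assumed to exist as in the earlier structure sketch.

import math

def min_front_distance(scan, half_width_deg=15):
    # Smallest valid reading within +/- half_width_deg of straight ahead,
    # assuming index 0 points forwards (verify this on your robot).
    readings = []
    for deg in range(-half_width_deg, half_width_deg + 1):
        idx = int(round(math.radians(deg) / scan.angle_increment)) % len(scan.ranges)
        r = scan.ranges[idx]
        if scan.range_min < r < scan.range_max:  # drop inf / invalid returns
            readings.append(r)
    return min(readings) if readings else float('inf')

# In the control/timer callback (sketch):
#   cmd = Twist()
#   if min_front_distance(self.latest_scan) > 0.5:  # 0.5 m: a first guess
#       cmd.linear.x = 0.15                         # clear ahead: drive forwards
#   self.cmd_pub.publish(cmd)                       # all zeros otherwise: stop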

Task 3: avoiding obstacles

You understand the LIDAR data, you can stop the robot if it's too close to something... now it's time to avoid the object! This is broadly divided into 3 steps:

  1. Drive forwards while the space is clear
  2. Detect that an object is too close
  3. Turn until the space in front of you is clear

🤖 Write some code to implement basic obstacle avoidance. Hint: you'll need the information you found out earlier about what data the LIDAR returns when obstacles are too far away to be detected.
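One possible shape for the decision itself is sketched below, written as a method on the node from the earlier structure sketch and reusing the min_front_distance() helper from Task 2; the speeds and the 0.5 m threshold are again guesses.

from geometry_msgs.msg import Twist

def avoidance_step(self):
    # Called from the control/timer callback; min_front_distance() as in the Task 2 sketch.
    cmd = Twist()
    if min_front_distance(self.latest_scan) > 0.5:
        cmd.linear.x = 0.15   # step 1: space is clear, drive forwards
    else:
        cmd.angular.z = 0.5   # steps 2-3: too close, rotate until the front window clears
    self.cmd_pub.publish(cmd)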

Behaviour 2: wall-following with LIDAR

For this behaviour, your robot should be able to decide if there is a wall on its left- or right-hand side and then follow that wall in the straightest line it can. It should also keep a sensible distance away from the wall to avoid collisions.

❓ How close is "too close" to the wall? Again, you should experiment - but it's probably slightly closer than your obstacle avoidance distance, as a good starting guess

❓ How can you keep your robot following in a straight line? You'll need to implement a PID controller, but try implementing just the Proportional controller first, rather than trying to tune the whole thing at once.

🤖 Implement wall-following. You should be able to make this work with your obstacle avoidance code pretty easily, but if it helps you can separate the code out and just work on one part at a time.
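A sketch of just the proportional part, assuming the wall is on the robot's right-hand side and side_distance is the reading at roughly -90 degrees; KP and TARGET_DIST are starting guesses to tune in simulation.

from geometry_msgs.msg import Twist

KP = 1.0           # proportional gain -- tune in simulation first
TARGET_DIST = 0.4  # desired distance from the wall, in metres

def follow_wall_step(side_distance):
    cmd = Twist()
    cmd.linear.x = 0.15
    error = TARGET_DIST - side_distance
    # Wall on the right: too close gives a positive error, so steer left
    # (positive angular.z); too far gives a negative error, so steer right.
    cmd.angular.z = KP * error
    return cmd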

Behaviour 3: random walk

If your robot isn't following a wall, and it isn't currently avoiding an obstacle, what should it do?

There are lots of exploration strategies that you'll hear about in the lectures and can learn about from doing extra reading. The most basic exploration technique is a "random walk", where the robot travels a random distance forwards, then turns by a random angle in a random direction.

❓ There are lots of things for you to experiment with and decide here: what's a good range for the random forward distance? For the angle to turn? Start with small amounts and experiment in simulation

❓ Your robot has positive and negative rotation, as we've mentioned previously. Which way turns left, which way turns right? How can you use this when deciding which direction to turn?

🤖 Implement your random walk behaviour.
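A sketch of choosing one random-walk step; the ranges below are only starting points to experiment with.

import math
import random

def pick_random_move():
    distance = random.uniform(0.3, 1.0)        # metres to drive forwards
    angle = random.uniform(-math.pi, math.pi)  # radians to turn; the sign of the
                                               # angular velocity picks left vs right
    return distance, angle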

Linking the behaviours with a state machine

Now that you have three behaviours, how can you structure them so the robot is always doing something sensible? You will need to decide on:

  1. Which behaviours can activate other behaviours
  2. What priority the behaviours should have
  3. What edge cases exist and how to mitigate them

The simplest way to do this is with a finite state machine (FSM); however, there are other behaviour management approaches (e.g. subsumption) that you've heard about in the lectures.

🤖 Choose a behaviour management technique to implement, and get everything working in simulation.
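If you go with an FSM, one common shape is an enum of states plus a transition check in the control callback. The states, conditions, and priorities below are illustrative, not a required design.

from enum import Enum, auto

class State(Enum):
    RANDOM_WALK = auto()
    WALL_FOLLOW = auto()
    AVOID_OBSTACLE = auto()

# In the control/timer callback (sketch -- the conditions are yours to define):
#   if obstacle_too_close:
#       self.state = State.AVOID_OBSTACLE   # highest priority
#   elif wall_detected:
#       self.state = State.WALL_FOLLOW
#   else:
#       self.state = State.RANDOM_WALK
#   ...then dispatch to the behaviour that matches self.state.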

❓ There are some behaviour transitions for you to decide on yourself here - for example, what should the robot do if it ends up following the wall around in a circle?

On the real robot

Quality of Service settings

QoS (quality of service) settings let us specify how ROS should handle the data from the sensors we're using. For the LIDAR, there's a mismatch between the reliability the sensor can offer ("best effort") and the reliability ROS expects by default ("reliable").

You'll need to add some code to change the QoS settings when you create your LIDAR subscriber, so that ROS accepts the LIDAR's best guess at the data. At the top of your script: from rclpy.qos import qos_profile_sensor_data, then replace the 10 in your subscriber as follows:

node.create_subscription(LaserScan, 'scan', chatter_callback, qos_profile_sensor_data)

Sim 2 Real gaps

We've talked about the "sim 2 real" gap in lectures - how, when you develop something in simulation, it doesn't immediately translate to the real world.

There's one very important sim 2 real thing you will need to explore and fix before you can put your final code onto the real robot, and that's checking that the LIDAR data from the real robot matches what you got from the simulated robot.

❓ How can you check this?

❓ Which readings in particular might be different between the simulation and real life?

If you are getting fewer than 360 LIDAR readings, you will need to figure out how many degrees each reading covers: you'll have one reading per 2π/len(ranges) radians. The LaserScan message also contains a field, angle_increment, which tells you how many radians each reading represents. You'll need to check this in every new scan message.
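A quick way to sanity-check this on the real robot is sketched below; report_scan_resolution is a hypothetical helper you could call from your scan callback, passing in your node.

import math

def report_scan_resolution(node, scan):
    degrees_per_reading = math.degrees(scan.angle_increment)
    node.get_logger().info(
        f'{len(scan.ranges)} readings, {degrees_per_reading:.2f} degrees per reading')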

You can see what the robot is "seeing" using a tool called RViz, which can be launched by using ros2 launch turtlebot3_bringup rviz2.launch.py. We'll be using this tool a lot going forwards, since it can show map data, LIDAR data, and camera data all through one interface. Spend some time getting familiar with it when you're testing your code on the real robot.

Resources

Tutorial on ROS2 packages

Tutorial on launch files