Heterogeneous robot

5.1 Intro:

In this chapter we will create a description package of a heterogeneous robot. The robot consists of two main parts: a holonomic platform with four wheels and an arm with six degrees of freedom. This is the robot that will be used for the rest of the tutorials.

Note: It is not recommended to follow this tutorial before finishing the previous tutorials.

5.3 Gazebo plugins:

Now that we are familiar with the package generated from SolidWorks, we can start adding some Gazebo plugins to the URDF file.

Before adding plugins, let’s take a look at what Gazebo plugins actually are. According to gazebosim.org, a Gazebo plugin is a chunk of code that is compiled as a shared library and inserted into the simulation. The plugin has direct access to all the functionality of Gazebo through the standard C++ classes. Plugins are useful because they:

- let developers control almost any aspect of Gazebo
- are self-contained routines that are easily shared
- can be inserted into and removed from a running system

There are several types of Gazebo plugins: World, Model, Sensor, System, Visual, and GUI.

5.3.1 ROS planar movement:

Before starting with the planar move plugin, we will prepare the workspace and the URDF file to be used with xacro. This will simplify the modeling.

First, navigate to the urdf folder of the example_urdf package exported from SolidWorks: $ cd ~/catkin_ws/src/example_urdf/urdf

Rename the example_urdf.urdf file to example_urdf.urdf.xacro: $ mv example_urdf.urdf example_urdf.urdf.xacro

Create a file called example.gazebo.xacro: $ code example.gazebo.xacro

Now let’s edit the URDF file. After the main robot tag at the beginning of the file we will do the following (the include is sketched right after this list):

- include the gazebo.xacro file
- add an extra virtual link called base_footprint
- add a joint to attach the whole robot model to the link
- create code regions
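A minimal sketch of the include line, assuming example.gazebo.xacro sits in the same urdf folder and that the robot tag of example_urdf.urdf.xacro declares the xacro namespace (xmlns:xacro="http://www.ros.org/wiki/xacro"):

<xacro:include filename="$(find example_urdf)/urdf/example.gazebo.xacro" />

The footprint link and joint then follow right after it: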

<!-- #region footprint -->
<link name="base_footprint" />

<joint name="base_joint" type="fixed">
  <parent link="base_footprint" />
  <child link="base_link" />
  <origin xyz="0 0 0" rpy="0 0 0" />
</joint>
<!-- #endregion footprint -->

Now that we have edited the URDF file and changed its name, we should amend the launch file (gazebo.launch) to reflect these changes:

<launch>
  <include file="$(find gazebo_ros)/launch/empty_world.launch" />

  <param name="robot_description" command="$(find xacro)/xacro '$(find example_urdf)/urdf/example_urdf.urdf.xacro'" />

  <node name="joint_state_publisher" pkg="joint_state_publisher" type="joint_state_publisher" />
  <node name="robot_state_publisher" pkg="robot_state_publisher" type="robot_state_publisher" />

  <node name="urdf_spawner" pkg="gazebo_ros" type="spawn_model" respawn="false" output="screen" args="-urdf -x 0 -y 0 -z 0 -model example_urdf -param robot_description" />
</launch>

Now we are ready to start using our first Gazebo plugin! We want to use the planar move plugin (registered below under the name object_controller) to take commands from the cmd_vel topic and control our robot via the virtual base_footprint link.

To do this, add the following in the example.gazebo.xacro:

<?xml version="1.0" encoding="utf-8"?>

<robot name="example_urdf" xmlns:xacro="http://www.ros.org/wiki/xacro">

<!--#region Holonomic-->

<gazebo>
  <plugin name="object_controller" filename="libgazebo_ros_planar_move.so">
    <commandTopic>cmd_vel</commandTopic>
    <odometryTopic>odom</odometryTopic>
    <odometryFrame>odom</odometryFrame>
    <odometryRate>20.0</odometryRate>
    <robotBaseFrame>base_footprint</robotBaseFrame>
  </plugin>
</gazebo>

<!--#endregion Holonomic-->

Remember to close the file with a </robot> tag after the last region you add.

To test the planar plugin, let’s install the teleop_twist_keyboard package: $ sudo apt install ros-melodic-teleop-twist-keyboard

Now let’s launch the example_urdf package: $ roslaunch example_urdf gazebo.launch

Run the teleop keyboard: $ rosrun teleop_twist_keyboard teleop_twist_keyboard.py
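(If you prefer to test without the keyboard node, you can also publish a Twist message on cmd_vel directly; the velocity values below are only an illustration.)

$ rostopic pub -r 10 /cmd_vel geometry_msgs/Twist '{linear: {x: 0.2, y: 0.0, z: 0.0}, angular: {x: 0.0, y: 0.0, z: 0.3}}'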

You can control the robot’s movement through the teleop node; hold Shift for holonomic (strafing) movement. Now that we’ve added the planar plugin, let’s give our robot some colour!

5.3.2 Materials:

Adding a material is a pretty straightforward task in Gazebo. It’s done by referencing the link by its name, and defining a material from the list of predefined materials (found here).

In the example.gazebo.xacro, create a material region, and add a colour for the base_link. Feel free to pick your own colour from the linked list.

<gazebo reference="base_link">

<material>Gazebo/Red</material>

</gazebo>
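The same pattern works for any other link in the model. For instance, assuming you also want to colour one of the laser links added later in this tutorial (the colour choice is arbitrary):

<gazebo reference="laser1">
  <material>Gazebo/Blue</material>
</gazebo>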

5.3.3 Virtual sensors & data synthesis:

Now we will be looking at plugins that allow us to simulate sensors and synthesize data to test our robot. First, we will look at the ROS laser plugin, which allows us to simulate a Light Detection and Ranging (LIDAR) sensor. Then we will take a look at the ROS Kinect plugin, which allows us to simulate a camera sensor.

5.3.3.1 ROS laser

In this section we will add the libgazebo_ros_laser.so plugin to our example.gazebo.xacro file in order to simulate two LIDARs. We will have to make a physical model for our LIDARs, and add the Gazebo plugin that simulates them afterwards.

First, let’s make the physical model for the LIDARs in our example_urdf.urdf.xacro. We’ll make two cylinders with a length of 0.0315 m and a radius of 0.02 m. Let’s put these links in their own region:

<!-- #region laser links-->
<link name="laser1">
  <visual>
    <origin xyz="0 0 0" rpy="0 0 0" />
    <geometry>
      <cylinder length="0.0315" radius="0.02" />
    </geometry>
  </visual>
</link>

<link name="laser2">
  <visual>
    <origin xyz="0 0 0" rpy="0 0 0" />
    <geometry>
      <cylinder length="0.0315" radius="0.02" />
    </geometry>
  </visual>
</link>

Now let’s join our two links to our base link:

<joint name="laser1_joint" type="fixed">

<origin xyz="0.1 -0.0 0.01" rpy="0 0 0" />

<parent link="base_link" />

<child link="laser1" />

</joint>

<joint name="laser2_joint" type="fixed">

<origin xyz="-0.1 0.0 0.01" rpy="0 0 3.14" />

<parent link="base_link" />

<child link="laser2" />

</joint>

<!--#endregion links-->

Now that we have the two models, we can add the laser plugin. To do so, add the following in the example.gazebo.xacro file:

<!--#region laser-->
<gazebo reference="laser2">
  <sensor type="ray" name="fullradius">
    <pose>0 0 0 0 0 0</pose>
    <visualize>true</visualize>
    <update_rate>40</update_rate>
    <ray>
      <scan>
        <horizontal>
          <samples>720</samples>
          <resolution>1</resolution>
          <min_angle>-1.770796</min_angle>
          <max_angle>1.770796</max_angle>
        </horizontal>
      </scan>
      <range>
        <min>0.10</min>
        <max>3.0</max>
        <resolution>0.01</resolution>
      </range>
      <noise>
        <type>gaussian</type>
        <mean>0.0</mean>
        <stddev>0.01</stddev>
      </noise>
    </ray>
    <plugin name="gazebo_ros_head_hokuyo_controller" filename="libgazebo_ros_laser.so">
      <topicName>scan2</topicName>
      <frameName>laser2</frameName>
    </plugin>
  </sensor>
</gazebo>

<gazebo reference="laser1">
  <sensor type="ray" name="halfradius">
    <pose>0 0 0 0 0 0</pose>
    <visualize>true</visualize>
    <update_rate>40</update_rate>
    <ray>
      <scan>
        <horizontal>
          <samples>720</samples>
          <resolution>1</resolution>
          <min_angle>-1.770796</min_angle>
          <max_angle>1.770796</max_angle>
        </horizontal>
      </scan>
      <range>
        <min>0.10</min>
        <max>3.0</max>
        <resolution>0.01</resolution>
      </range>
      <noise>
        <type>gaussian</type>
        <mean>0.0</mean>
        <stddev>0.01</stddev>
      </noise>
    </ray>
    <plugin name="gazebo_ros_head_hokuyo_controller" filename="libgazebo_ros_laser.so">
      <topicName>scan1</topicName>
      <frameName>laser1</frameName>
    </plugin>
  </sensor>
</gazebo>
<!--#endregion laser-->

Notice that we can edit the plugin values to match what we are trying to achieve. For example, we can adjust the laser angle in:

<min_angle>-1.770796</min_angle>
<max_angle>1.770796</max_angle>

We can also set the `<visualize>` value to false if we don’t want the laser rays drawn in Gazebo. For more information about the plugin, please refer to the Gazebo plugin tutorials. Below is a screenshot from Gazebo of the current state of the robot.
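To confirm that the simulated scans are being published, you can inspect the topics; the names scan1 and scan2 follow from the topicName values configured above:

$ rostopic list | grep scan

$ rostopic echo /scan1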

5.3.3.2 ROS Kinect

The final plugin we are adding to our robot is libgazebo_ros_openni_kinect.so, which will simulate a Kinect-like depth camera. We will start by using xacro to define some global properties in our example_urdf.urdf.xacro, to be used later in the link origins:

<xacro:property name="r200_cam_rgb_px" value="0.380" />
<xacro:property name="r200_cam_rgb_py" value="-0.07" />
<xacro:property name="r200_cam_rgb_pz" value="-0.031" />
<xacro:property name="r200_cam_depth_offset" value="0.03" />

Now we will add a link to serve as our camera body in our example_urdf.urdf.xacro, and join it to the base link, just as we did for the LIDARs. We will also use joints to define the positions of the different camera frames, to be used later:

<!--#region cam links-->
<joint name="camera_joint" type="fixed">
  <origin xyz="${r200_cam_rgb_px} ${r200_cam_rgb_py} ${r200_cam_rgb_pz}" rpy="0 0 0" />
  <parent link="base_link" />
  <child link="camera_link" />
</joint>

<link name="camera_link">
  <visual>
    <origin xyz="0 0 0" rpy="1.57 0 1.57" />
    <geometry>
      <mesh filename="package://example_urdf/meshes/r200.dae" />
    </geometry>
  </visual>
  <collision>
    <origin xyz="0.003 0.065 0.007" rpy="0 0 0" />
    <geometry>
      <box size="0.012 0.132 0.020" />
    </geometry>
  </collision>
</link>

<joint name="camera_rgb_joint" type="fixed">
  <origin xyz="${r200_cam_rgb_px} ${r200_cam_rgb_py} ${r200_cam_rgb_pz}" rpy="0 0 0" />
  <parent link="camera_link" />
  <child link="camera_rgb_frame" />
</joint>

<link name="camera_rgb_frame" />

<joint name="camera_rgb_optical_joint" type="fixed">
  <origin xyz="0 0 0" rpy="-1.57 0 -1.57" />
  <parent link="camera_rgb_frame" />
  <child link="camera_rgb_optical_frame" />
</joint>

<link name="camera_rgb_optical_frame" />

<joint name="camera_depth_joint" type="fixed">
  <origin xyz="${r200_cam_rgb_px} ${r200_cam_rgb_py + r200_cam_depth_offset} ${r200_cam_rgb_pz}" rpy="0 0 0" />
  <parent link="camera_link" />
  <child link="camera_depth_frame" />
</joint>

<link name="camera_depth_frame" />

<joint name="camera_depth_optical_joint" type="fixed">
  <origin xyz="0 0 0" rpy="-1.57 0 -1.57" />
  <parent link="camera_depth_frame" />
  <child link="camera_depth_optical_frame" />
</joint>

<link name="camera_depth_optical_frame" />
<!--#endregion cam links-->

After defining the links and joints in our example_urdf.urdf.xacro, we can add the Gazebo plugin that will simulate the camera in our example.gazebo.xacro by adding the following:

<!--#region cams-->
<gazebo reference="camera_rgb_frame">
  <sensor type="depth" name="realsense_R200">
    <always_on>true</always_on>
    <visualize>true</visualize>
    <camera>
      <horizontal_fov>1.3439</horizontal_fov>
      <image>
        <width>1920</width>
        <height>1080</height>
        <format>R8G8B8</format>
      </image>
      <depth_camera></depth_camera>
      <clip>
        <near>0.03</near>
        <far>100</far>
      </clip>
    </camera>
    <plugin name="camera_controller" filename="libgazebo_ros_openni_kinect.so">
      <baseline>0.2</baseline>
      <alwaysOn>true</alwaysOn>
      <updateRate>30.0</updateRate>
      <cameraName>camera</cameraName>
      <frameName>camera_rgb_optical_frame</frameName>
      <imageTopicName>rgb/image_raw</imageTopicName>
      <depthImageTopicName>depth/image_raw</depthImageTopicName>
      <pointCloudTopicName>depth/points</pointCloudTopicName>
      <cameraInfoTopicName>rgb/camera_info</cameraInfoTopicName>
      <depthImageCameraInfoTopicName>depth/camera_info</depthImageCameraInfoTopicName>
      <pointCloudCutoff>0.4</pointCloudCutoff>
      <hackBaseline>0.07</hackBaseline>
      <distortionK1>0.0</distortionK1>
      <distortionK2>0.0</distortionK2>
      <distortionK3>0.0</distortionK3>
      <distortionT1>0.0</distortionT1>
      <distortionT2>0.0</distortionT2>
      <CxPrime>0.0</CxPrime>
      <Cx>0.0</Cx>
      <Cy>0.0</Cy>
      <focalLength>0</focalLength>
    </plugin>
  </sensor>
</gazebo>
<!--#endregion cams-->

Below is a visualization of the camera sensor in rviz.
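If you want a quick look at the simulated colour image outside of RViz, the image_view tool can subscribe to it; the topic name follows from the cameraName and imageTopicName values configured above:

$ rosrun image_view image_view image:=/camera/rgb/image_raw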

5.4 Simulation environment:

Now that we’ve modeled our own robot, it’s time to actually simulate it in Gazebo. To do so, we’re going to clone a git repository to download a world file and add that world to our Gazebo launch file. The first step is to clone the repository, which requires git. If you don’t already have it, open a terminal window and install it: $ sudo apt install git

Now, let’s navigate to our catkin workspace, and make a new directory called house_world to clone the repository into:

$ cd ~/catkin_ws/src

$ mkdir house_world

Then, navigate to the directory you just made, initialize git, and clone the repository:

$ cd house_world

$ git init

$ git clone https://github.com/fontysrobotics/ARMinor-2020-Opensource-Virtual-Learning-Environment-AROVLE

If you look in the directory you just cloned into, you’ll notice there are 2 new directories there, called worlds and models. These include all the models for a world with a house in it, which we will now add to the launch file.

Open gazebo.launch and replace the existing empty_world include with the following:

<include file="$(find gazebo_ros)/launch/empty_world.launch">
  <arg name="world_name" value="$(find example_urdf)/worlds/house.world"/>
  <arg name="paused" value="false"/>
  <arg name="use_sim_time" value="true"/>
  <arg name="gui" value="true"/>
  <arg name="headless" value="false"/>
  <arg name="debug" value="false"/>
</include>
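Note that world_name points at $(find example_urdf)/worlds/house.world, so the world and model files have to be reachable from the example_urdf package. One way to arrange this (a sketch, assuming the worlds and models folders sit at the top level of the cloned repository) is to copy them into the package and make the models visible to Gazebo:

$ cp -r ~/catkin_ws/src/house_world/ARMinor-2020-Opensource-Virtual-Learning-Environment-AROVLE/worlds ~/catkin_ws/src/example_urdf/

$ cp -r ~/catkin_ws/src/house_world/ARMinor-2020-Opensource-Virtual-Learning-Environment-AROVLE/models ~/catkin_ws/src/example_urdf/

$ export GAZEBO_MODEL_PATH=$GAZEBO_MODEL_PATH:~/catkin_ws/src/example_urdf/models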

Now launch gazebo.launch to start the simulation.
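As before, this is done with: $ roslaunch example_urdf gazebo.launch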
