Note: This tutorial assumes you have a working turtlebot with an RGBD sensor and are looking to add a LIDAR, and that you have completed the previous tutorials: ROS tutorials.
Please ask about problems and questions regarding this tutorial on answers.ros.org. Don't forget to include in your question the link to this page, the versions of your OS & ROS, and also add appropriate tags.
Adding a lidar to the turtlebot using hector_models (Hokuyo UTM-30LX)
Description: A revised tutorial for adding a Hokuyo LIDAR to the turtlebot, updated for Indigo and using the hector_models package.
Keywords: Hokuyo, Lidar, Laser, Turtlebot, hector_models
Tutorial Level: INTERMEDIATE
This is the quickest way I have found to add a LIDAR systematically. The same approach works for pretty much any sensor; this example shows adding the Hokuyo UTM-30LX LIDAR to a Turtlebot.
Install the ROS driver for the sensor
sudo apt-get install ros-indigo-hokuyo-node
Note: For newer versions of ROS (Kinetic and above) you might have to use the urg_node package instead:
sudo apt-get install ros-kinetic-urg-node
Plug in the sensor
The UTM-30LX shows up as /dev/ttyACM0. Check that the port has the right permissions and, if needed, grant read/write access:
ls -l /dev/ttyACM0
sudo chmod a+rw /dev/ttyACM0
ls -l /dev/ttyACM0
crw-rw-rw- 1 root dialout 166, 0 Jun 15 10:38 /dev/ttyACM0
To make the permission change permanent, create a file 50-hokuyo.rules in /etc/udev/rules.d with the following line in it (you will need sudo permission).
Important: Make sure you update the /opt/ros/ path in the rule to match the ROS distro you currently have installed.
KERNEL=="ttyACM[0-9]*", ACTION=="add", ATTRS{idVendor}=="15d1", MODE="0666", GROUP="dialout", PROGRAM="/opt/ros/indigo/env.sh rosrun hokuyo_node getID %N q", SYMLINK+="sensors/hokuyo_%c"
Then reload the UDEV rules using
sudo udevadm control --reload
Finally unplug and replug the sensor so that the new rules take effect. This should set the correct permissions and additionally create a symbolic link under /dev/sensors/hokuyo_XXXXXXX in case a specific sensor needs to be specified.
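To confirm the rule worked after replugging, list the symlink directory (the exact hokuyo_XXXXXXX suffix depends on your sensor's serial number):
ls -l /dev/sensors/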
Test the sensor
Test that the node starts and publishes data correctly:
# term 1
roscore

# term 2
rosrun hokuyo_node hokuyo_node

# term 3
rostopic echo /scan
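If you installed urg_node instead of hokuyo_node (Kinetic and newer), the equivalent test is roughly the following; urg_node takes the device through its serial_port parameter and also publishes on /scan:
# term 2 (urg_node instead of hokuyo_node)
rosrun urg_node urg_node _serial_port:=/dev/ttyACM0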
Add Sensor to the Robot URDF
There are many ways to add the sensor to the robot's URDF. The method I found most straightforward uses the hector_models package, which currently contains URDF and meshes for the Asus camera, Flir A35 camera, Hokuyo UTM-30LX, Kinect camera, RealSense camera, sonar sensor, thermaleye camera and VLP-16. Install the hector_models package with
sudo apt-get install ros-indigo-hector-models
Then include the sensor's URDF/xacro in the robot model. For a turtlebot using the Kinect, open the file /opt/ros/indigo/share/turtlebot_description/robots/kobuki_hexagons_kinect.urdf.xacro and add the following lines.
Note you will need sudo permissions to edit this file.
Add this right after the <robot> tag
<xacro:include filename="$(find hector_sensors_description)/urdf/hokuyo_utm30lx.urdf.xacro"/>
Add this right before the </robot> tag. Make sure this code block passes all the parameters required by the respective sensor URDF. Also update the xyz and rpy parameters to reflect the geometry of the sensor on the robot with respect to the chosen parent link.
<hokuyo_utm30lx_model name="hokuyo_laser" parent="base_link">
  <origin xyz="0 0 +0.30" rpy="0 0 0" />
</hokuyo_utm30lx_model>
Test that the xacro file compiles correctly by running:
rosrun xacro xacro kobuki_hexagons_kinect.urdf.xacro > ~/temp.xml
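If the urdfdom command-line tools are installed (for example via liburdfdom-tools), the generated model can also be sanity-checked; check_urdf prints the link tree, which should now include the Hokuyo links:
check_urdf ~/temp.xml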
Add the Hokuyo node to a new launch file
Copy the default minimal.launch file to minimal_with_hokuyo.launch in the /opt/ros/indigo/share/turtlebot_bringup/launch folder.
Note you will need sudo permissions to do this.
Add the following lines before the </launch> tag towards the end, updating the parameters as required, especially the frame_id:
1 <node name="hokuyo" pkg="hokuyo_node" type="hokuyo_node" respawn="false" output="screen">
2 <param name="calibrate_time" type="bool" value="true"/>
3 <param name="port" type="string" value="/dev/ttyACM0"/>
4 <param name="intensity" type="bool" value="false"/>
5 <param name="min_ang" value="-2.2689"/>
6 <param name="max_ang" value="+2.2689"/>
7 <param name="cluster" value="1"/>
8 <param name="frame_id" value="hokuyo_laser_frame"/>
9 </node>
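If you are on Kinetic or newer and are using urg_node, a roughly equivalent node block would be the following (parameter names taken from the urg_node documentation; double-check them against your installed version):
<node name="urg_node" pkg="urg_node" type="urg_node" respawn="false" output="screen">
  <param name="serial_port" value="/dev/ttyACM0"/>
  <param name="calibrate_time" value="true"/>
  <param name="publish_intensity" value="false"/>
  <param name="angle_min" value="-2.2689"/>
  <param name="angle_max" value="2.2689"/>
  <param name="frame_id" value="hokuyo_laser_frame"/>
</node>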
Check the results by launching minimal_with_hokuyo.launch and viewing the robot in rviz:
roslaunch turtlebot_bringup minimal_with_hokuyo.launch
roslaunch turtlebot_rviz_launchers view_robot.launch
The result should look like:
GMAPPING with the Hokuyo Lidar
In order to use the turtlebot's gmapping with the LIDAR, we need to turn off the 'laser' generated from the RGBD sensor and use the LIDAR instead. The basic gmapping package would also work directly, but it does not have move_base integrated, so the turtlebot gmapping demo is preferable. There are many ways of doing this; here is one that leaves most of the original files intact and requires minimal modification.
Modify the gmapping_demo.launch file
Navigate to /opt/ros/indigo/share/turtlebot_navigation/launch and copy gmapping_demo.launch to gmapping_demo_hokuyo.launch. Note: you will need sudo permissions to work in this folder. Find the following line and set the scan_processing argument to false to turn off the RGBD scan processing:
1 <arg name="depth_processing" value="false" />
2 <!-- Add this line -->
3 <arg name="scan_processing" value="false"/>
Also modify the custom_gmapping_launch_file parameter to direct it to a new file as follows:
<!-- Change this line -->
<arg name="custom_gmapping_launch_file" default="$(find turtlebot_navigation)/launch/includes/gmapping/$(arg 3d_sensor)_gmapping.launch.xml"/>

<!-- To -->
<arg name="custom_gmapping_launch_file" default="$(find turtlebot_navigation)/launch/includes/gmapping/$(arg 3d_sensor)_hokuyo_gmapping.launch.xml"/>
Modify the $(arg 3d_sensor)_gmapping.launch.xml file
Finally, modify the gmapping configuration file to reflect parameters suited to the laser. Find the correct launch file for your setup in /opt/ros/indigo/share/turtlebot_navigation/launch/includes/gmapping. If you are using a Kinect sensor, that file is kinect_gmapping.launch.xml. Copy the file to kinect_hokuyo_gmapping.launch.xml. Note: once again you will need sudo permissions to modify these files. Edit the parameters to better match the Hokuyo LIDAR and your computer's processing power. The key parameters to modify are maxRange and maxUrange, which should match the laser's maximum range and maximum usable range respectively. The other parameters found to be critical are iterations, minimumScore, linearUpdate, angularUpdate, temporalUpdate and particles. Try playing with these parameters; since everything has been copied into a new file, you can experiment without worrying about breaking the default settings.
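As an illustration, the slam_gmapping section of kinect_hokuyo_gmapping.launch.xml could end up looking something like the snippet below. The values are only assumed starting points for a UTM-30LX on a mid-range laptop, not tuned results, and only the parameters discussed above are shown; the rest of the copied file stays as it was:
<node pkg="gmapping" type="slam_gmapping" name="slam_gmapping" output="screen">
  <!-- UTM-30LX nominal range is 30 m; keep maxUrange a bit below maxRange -->
  <param name="maxRange" value="30.0"/>
  <param name="maxUrange" value="25.0"/>
  <!-- These trade map quality against CPU load -->
  <param name="iterations" value="5"/>
  <param name="minimumScore" value="50"/>
  <param name="linearUpdate" value="0.5"/>
  <param name="angularUpdate" value="0.436"/>
  <param name="temporalUpdate" value="-1.0"/>
  <param name="particles" value="80"/>
</node>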
Run GMAPPING with hokuyo
Now we are ready to run gmapping with the Hokuyo. SSH into the robot's computer and run:
# Term 1
roslaunch turtlebot_bringup minimal_with_hokuyo.launch

# Term 2
roslaunch turtlebot_navigation gmapping_demo_hokuyo.launch
On the workstation computer run:
roslaunch turtlebot_rviz_launchers view_navigation.launch
The result of all this work should look like:
You should be able to use all the original features like providing a 2D Nav Goal and having the robot drive to it.
Additional Resources
https://edu.gaitech.hk/turtlebot/hokuyo-turtlebot.html
http://amanbreakingthings.blogspot.com/2014/11/adding-hokuyo-lidar-to-turtlebot-in-ros.html
http://wiki.ros.org/turtlebot/Tutorials/hydro/Adding%20a%20Hokuyo%20laser%20to%20your%20Turtlebot
http://www.iroboapp.org/index.php?title=Adding_Hokuyo_Laser_Range_Finder_to_Turtlebot
http://wiki.ros.org/hector_models