
Ackermann steering car robot model with simulation in Gazebo

Most of the wheeled robots in ROS use move_base to move the robot. The underlying geometry model is differential drive, which transforms a velocity command (a Twist message) into commands that rotate the left and right wheels at different speeds, letting the robot turn left or right or go straight.
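For illustration, here is a minimal sketch of that transformation in Python (the function name and the wheel-separation/radius values are made up for the example, not taken from move_base):

```python
import math

def twist_to_wheel_speeds(v, omega, wheel_separation=0.3, wheel_radius=0.05):
    """Convert a Twist command (linear v in m/s, angular omega in rad/s)
    into left/right wheel angular velocities (rad/s) for a differential drive."""
    v_left = v - omega * wheel_separation / 2.0   # left wheel linear speed
    v_right = v + omega * wheel_separation / 2.0  # right wheel linear speed
    return v_left / wheel_radius, v_right / wheel_radius

# Pure rotation in place: the wheels spin at equal speed in opposite directions.
left, right = twist_to_wheel_speeds(0.0, 1.0)
```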

Differential drive wheel model. Image Courtesy

But cars have Ackermann steering geometry.


Ackermann steering geometry. Image Courtesy.

I was looking for a car robot model with such geometry so I could test it in Gazebo and ROS. I didn’t find exactly what I was looking for, but I found several packages and, with some adaptations, I managed to build and control a car with Ackermann steering geometry using a joystick.

As you can see in the following graph, I read the joystick data and translate it into Twist messages (on the cmd_vel topic). These are then converted into Ackermann messages (on the ackermann_cmd topic).
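The Twist-to-Ackermann translation is a small piece of geometry. Here is a minimal sketch of the conversion (the function name and default wheelbase are my own, not taken from the packages I used):

```python
import math

def twist_to_ackermann(v, omega, wheelbase=0.335):
    """Convert a Twist (linear v, angular omega) into an Ackermann command.
    The speed is passed through unchanged; the steering angle follows from the
    turning radius R = v / omega, so steering = atan(wheelbase / R)
    = atan(wheelbase * omega / v)."""
    if omega == 0 or v == 0:
        # Straight line, or standing still: an Ackermann car cannot turn in place.
        return v, 0.0
    steering_angle = math.atan(wheelbase * omega / v)
    return v, steering_angle
```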



The robot in the video was downloaded from here and modified for this work.

Autonomous navigation of two wheels differential drive robot in Gazebo

A two-wheel differential drive robot (with two caster wheels).
List of installed sensors:
• Velodyne VLP-16.
• Velodyne HDL-32E.
• Hokuyo Laser scanner.
• IMU.
• Microsoft Kinect/Asus Xtion Pro.
• RGB Camera.

You can manually control the robot with a joystick to map the robot’s environment.
Autonomous navigation is possible by setting a goal pose.


Assembling Laser scans into PCL point cloud Using Gazebo and ROS

For this work, I first loaded the RRBot in Gazebo and launched its joint controllers, then I sent a periodic signal to the robot so that the laser scanner mounted on it swings.

I then assembled the incoming laser scans, using the transforms from tf, into a PCL point cloud.

Install the necessary package:

Get gazebo_ros_demos from GitHub:

Add the path to ROS_PACKAGE_PATH

Due to recent updates, you need to make a change in the file rrbot.gazebo: add

this line <legacyModeNS>true</legacyModeNS>
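For reference, the gazebo_ros_control plugin block in rrbot.gazebo would then look roughly like this (the namespace value is what the rrbot demo uses; double-check it against your copy of the file):

```xml
<gazebo>
  <plugin name="gazebo_ros_control" filename="libgazebo_ros_control.so">
    <robotNamespace>/rrbot</robotNamespace>
    <legacyModeNS>true</legacyModeNS>
  </plugin>
</gazebo>
```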

Now run the following:

You need to install some plugins for rqt. These plugins will enable you to send messages with rqt.

Now launch rqt_gui:

Set the second joint command topic

(/rrbot/joint2_position_controller/command) to (pi/4)+(1*pi/4)*sin(i/40)*sin(i/40)

with the frequency set to 50 Hz, and set /rrbot/joint1_position_controller/command to 0.
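To see what that expression publishes, here it is evaluated in plain Python (i is the sample index that rqt increments at the publishing frequency):

```python
import math

def joint2_command(i):
    """Periodic swing signal for the second joint: oscillates between
    pi/4 (when sin(i/40) = 0) and pi/2 (when |sin(i/40)| = 1)."""
    return (math.pi / 4) + (math.pi / 4) * math.sin(i / 40.0) ** 2
```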

Laser Assembler:

Finally, run:

Create a launch file named laser_assembler.launch with the following lines:
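As a starting point, a minimal laser_assembler.launch could look like the following; the scan topic remap and the fixed_frame value are assumptions for the RRBot setup and may need adjusting:

```xml
<launch>
  <!-- Buffers incoming LaserScan messages so they can later be assembled
       into a single point cloud via the assemble_scans service. -->
  <node pkg="laser_assembler" type="laser_scan_assembler" name="laser_assembler">
    <remap from="scan" to="/rrbot/laser/scan"/>
    <param name="max_scans" type="int" value="400"/>
    <param name="fixed_frame" type="string" value="world"/>
  </node>
</launch>
```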

and run it with roslaunch:

You should get this:

Source code is on my GitHub.

SLAM using gmapping with the TurtleBot robot and Gazebo

In this tutorial, we do some SLAM with the TurtleBot robot.

1. Before anything, you have to install all the packages for Gazebo, gmapping, and TurtleBot:

2. Launch Gazebo and TurtleBot:

3. Call gmapping to read the laser scans and build the map:

Only for indigo:

If you get an error, you need to do some hacky stuff and make the following changes:

Comment out this line:

and add the following line:

At the end, your file should look like this:

4. Launch rviz so you can see the map:

5. Move your robot via keyboard or joystick controller:




To make your joystick controller work, you have to hold the “axis_deadman” button. This is specified in the corresponding launch file. For instance, for the Xbox 360 controller, edit

You can see that in this file it has been assigned to 4.
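I don’t reproduce the whole launch file here, but the relevant parameter assignment has this form:

```xml
<param name="axis_deadman" value="4"/>
```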


To see which button on your controller has been assigned to “axis_deadman”, run

and press the buttons one by one until you see which one is button 4.

Now you have to keep that button pressed to send commands from your controller. You can watch the state of each button by listening to the /joy topic:
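A Joy message carries a buttons array of 0/1 values; a tiny helper like this (my own, not part of any ROS package) shows how you would spot which index lights up when you press a button:

```python
def pressed_buttons(buttons):
    """Return the indices of the buttons currently pressed (value 1)
    in a sensor_msgs/Joy 'buttons' array."""
    return [i for i, state in enumerate(buttons) if state == 1]

# e.g. with only the button at index 4 held down:
pressed_buttons([0, 0, 0, 0, 1, 0, 0, 0])
```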

Now start adding objects to the Gazebo scene and explore the environment; you should see the map: