
Ackermann steering car robot model with simulation in Gazebo

Most of the wheeled robots in ROS use move_base to move the robot. The geometry model behind move_base is a differential drive, which transforms a velocity command (a Twist message) into commands that rotate the left and right wheels at different speeds, enabling the robot to turn right or left or go straight.

Differential drive wheel model. Image Courtesy

But cars have Ackermann steering geometry.

 

Ackermann steering geometry. Image Courtesy.

I was looking for a car robot model with such geometry so I could test it in Gazebo and ROS. I didn’t find exactly what I was looking for, but I found several packages, and with some adaptations I managed to build and control a car with Ackermann steering geometry using a joystick.

As you can see in the following graph, I read the joystick data and translate it into Twist messages (on the cmd_vel topic). Then I convert these messages into Ackermann messages (on the ackermann_cmd topic).

 

Node graph: joy → cmd_vel → ackermann_cmd.
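A minimal sketch of that cmd_vel-to-ackermann_cmd conversion, assuming the common bicycle-model approximation of Ackermann geometry (the wheelbase value and node details are placeholders, not the exact code used here):

    #!/usr/bin/env python
    # Sketch: convert Twist messages on cmd_vel into AckermannDriveStamped
    # messages on ackermann_cmd using the bicycle approximation.
    import math
    import rospy
    from geometry_msgs.msg import Twist
    from ackermann_msgs.msg import AckermannDriveStamped

    WHEELBASE = 1.0  # placeholder: front-to-rear axle distance in meters

    def on_cmd_vel(twist):
        msg = AckermannDriveStamped()
        msg.header.stamp = rospy.Time.now()
        msg.drive.speed = twist.linear.x
        if twist.linear.x != 0.0:
            # steering angle that yields the requested yaw rate at this speed
            msg.drive.steering_angle = math.atan(
                WHEELBASE * twist.angular.z / twist.linear.x)
        pub.publish(msg)

    rospy.init_node('cmd_vel_to_ackermann')
    pub = rospy.Publisher('ackermann_cmd', AckermannDriveStamped, queue_size=1)
    rospy.Subscriber('cmd_vel', Twist, on_cmd_vel)
    rospy.spin()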

The robot in the video was downloaded from here, with some modifications for this work.

Autonomous navigation of a two-wheeled differential drive robot in Gazebo

A two-wheeled differential drive robot (with two caster wheels).
List of installed sensors:
• Velodyne VLP-16.
• Velodyne HDL-32E.
• Hokuyo Laser scanner.
• IMU.
• Microsoft Kinect/Asus Xtion Pro.
• RGB Camera.

You can manually control the robot with a joystick controller to map the robot’s environment.
Autonomous navigation is possible by setting a goal pose, as in the example below.
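For instance, assuming the standard move_base interface, a goal pose can be published from the command line (the frame and coordinates here are placeholders):

    rostopic pub /move_base_simple/goal geometry_msgs/PoseStamped '{header: {frame_id: "map"}, pose: {position: {x: 1.0, y: 0.5, z: 0.0}, orientation: {w: 1.0}}}' -1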

 

Assembling laser scans into a PCL point cloud using Gazebo and ROS

For this work, first I loaded the RRBot in Gazebo and launched its joint controllers; then I sent a periodic signal to the robot so that the laser scanner mounted on it swings.

Next, I assembled the incoming laser scans using the transformations from tf and created a PCL point cloud.

Install the necessary package:
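Assuming the RRBot from the gazebo_ros_demos repository (which provides rrbot_description, rrbot_gazebo, and rrbot_control):

    git clone https://github.com/ros-simulation/gazebo_ros_demos.git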

Add the path to your ROS_PACKAGE_PATH:
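Assuming the repository was cloned into your home directory:

    export ROS_PACKAGE_PATH=$ROS_PACKAGE_PATH:~/gazebo_ros_demos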

Set the second joint’s command (the /rrbot/joint2_position_controller/command topic) to (pi/4)+(1*pi/4)*sin(i/40)*sin(i/40) and the publishing frequency to 50 Hz, for example in rqt’s Message Publisher, which accepts such expressions with i as the message counter.
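Equivalently, a small rospy script can publish the same swing signal (a sketch, not the exact code used in this post):

    #!/usr/bin/env python
    # Sketch: swing RRBot's second joint by publishing a squared-sine
    # position command at 50 Hz.
    import math
    import rospy
    from std_msgs.msg import Float64

    rospy.init_node('joint2_swing')
    pub = rospy.Publisher('/rrbot/joint2_position_controller/command',
                          Float64, queue_size=10)
    rate = rospy.Rate(50)  # 50 Hz
    i = 0
    while not rospy.is_shutdown():
        # (pi/4) + (pi/4)*sin(i/40)^2, as described above
        pub.publish(Float64((math.pi / 4) + (math.pi / 4) * math.sin(i / 40.0) ** 2))
        i += 1
        rate.sleep()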

Laser Assembler:
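The assembler comes from the laser_assembler package, presumably installed with:

    sudo apt-get install ros-kinetic-laser-assembler

(replace kinetic with your ROS distribution)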

Finally, create a launch file, save the following lines to it, and run it with roslaunch:
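A sketch of such a launch file, based on the laser_scan_assembler node from the laser_assembler documentation (the scan topic and fixed frame below are assumptions for the RRBot setup):

    <launch>
      <node type="laser_scan_assembler" pkg="laser_assembler" name="laser_assembler">
        <remap from="scan" to="/rrbot/laser/scan"/>
        <param name="max_scans" type="int" value="400"/>
        <param name="fixed_frame" type="string" value="world"/>
      </node>
    </launch>

The assembled cloud can then be requested through the node’s assemble_scans2 service, which returns a PointCloud2 message that PCL can consume directly.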

Source code is available on my GitHub.

SLAM using gmapping with TurtleBot robot and Gazebo

In this tutorial, we do some SLAM with the TurtleBot robot.

1. Before anything, you have to install all the packages for Gazebo, gmapping, and TurtleBot:
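Assuming ROS Kinetic (adjust the distribution name as needed):

    sudo apt-get install ros-kinetic-turtlebot-gazebo ros-kinetic-gmapping ros-kinetic-turtlebot-teleop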

2. Launch Gazebo and TurtleBot:
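For example, with the standard TurtleBot world:

    roslaunch turtlebot_gazebo turtlebot_world.launch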

3. Call the gmapping node to read the laser scans and build the map:
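Presumably via the demo launcher that ships with turtlebot_gazebo:

    roslaunch turtlebot_gazebo gmapping_demo.launch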

If you get an error, you need to do some hacky stuff and make the following change: comment out the offending include line and add a corrected one in its place, so that at the end your file looks like the sketch below.
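In my case the problem was the gmapping include path in gmapping_demo.launch (ROS Kinetic; treat the exact paths as assumptions and check the path your own error message complains about):

    <launch>
      <!-- old include, commented out because the file has moved: -->
      <!-- <include file="$(find turtlebot_navigation)/launch/includes/gmapping.launch.xml"/> -->
      <!-- corrected location of the gmapping include: -->
      <include file="$(find turtlebot_navigation)/launch/includes/gmapping/gmapping.launch.xml"/>
    </launch>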

4. Call RViz so you can see the map:
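For instance, using the TurtleBot RViz launchers:

    roslaunch turtlebot_rviz_launchers view_navigation.launch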

5. Move your robot via the keyboard or a joystick controller, using one of the teleop launchers shown below.
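Assuming the standard turtlebot_teleop package (the launch file names may differ between distributions):

    roslaunch turtlebot_teleop keyboard_teleop.launch

or

    roslaunch turtlebot_teleop logitech.launch

or

    roslaunch turtlebot_teleop xbox360_teleop.launch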

To make your joystick controller work, you have to hold the “axis_deadman” button. This is specified in the corresponding launch file. For instance, for the Xbox 360 controller, edit:
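Presumably at the standard install path:

    /opt/ros/kinetic/share/turtlebot_teleop/launch/xbox360_teleop.launch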

You can see that in this file it has been assigned to 4:
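The relevant line looks something like this (an approximation; check your own file):

    <param name="axis_deadman" value="4"/>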

 

To see which button on your controller has been assigned to “axis_deadman”, run:
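Presumably jstest from the Linux joystick utilities (installed with sudo apt-get install joystick):

    jstest /dev/input/js0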

and press all buttons one by one until you see which one is button 4.

Now you have to keep pressing that button while sending commands from your controller. You can watch the state of each button by listening to the /joy topic:
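    rostopic echo /joy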

Now start adding objects to the Gazebo scene and explore the environment; you should see the map grow:

 

Making an occupancy grid map in ROS from Gazebo with Octomap

Here is an example of obtaining an occupancy grid from the sensory data of the TurtleBot robot.

1. First you need to install all the dependencies for Gazebo, TurtleBot, and the octomap server:
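Assuming ROS Kinetic:

    sudo apt-get install ros-kinetic-turtlebot-gazebo ros-kinetic-octomap-server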

2. Launch Gazebo with a simulated environment:
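For example, the standard TurtleBot world:

    roslaunch turtlebot_gazebo turtlebot_world.launch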

3. Launch RViz to view the published data:
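For instance:

    roslaunch turtlebot_rviz_launchers view_robot.launch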

4. Move your robot via keyboard:
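Presumably with the standard keyboard teleop:

    roslaunch turtlebot_teleop keyboard_teleop.launch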

To learn how to move the robot with your joystick controller, please follow my other tutorial here.

5. Create a text file and save it as octomap_turtlebot.launch, then paste the following lines into it:
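A sketch of such a launch file for octomap_server (the input cloud topic, fixed frame, and parameter values below are assumptions for the simulated TurtleBot’s Kinect):

    <launch>
      <node pkg="octomap_server" type="octomap_server_node" name="octomap_server">
        <param name="resolution" value="0.05"/>
        <param name="frame_id" type="string" value="odom"/>
        <param name="sensor_model/max_range" value="5.0"/>
        <remap from="cloud_in" to="/camera/depth/points"/>
      </node>
    </launch>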

 

and call it with:
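    roslaunch octomap_turtlebot.launch

(roslaunch accepts a path to a launch file, so run this from the directory where you saved it)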

6. Now in RViz, add an OccupancyMap display and view the octomap. Of course, you need to remove the floor first 🙂