I was looking for a way to create an octomap tree from an arbitrary mesh file. At first I tried to convert my data into a PCL point cloud and then convert that into an octomap tree. The problem is that, for instance, when you have a cuboid mesh, you only have 8 vertices, which gives you 8 points, and the walls between them won't appear in the final octomap tree. So I found the following solution:
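To see the problem concretely, here is a small Python sketch (the resolution and cube size are made-up numbers for illustration): voxelizing only the mesh vertices of a unit cube marks 8 voxels, while a proper surface voxelization at the same resolution would mark every boundary voxel.

```python
res = 0.25            # hypothetical voxel size
n = int(1 / res)      # 4 voxels per axis

def voxel(p):
    # Map a point to its voxel index, clamping the far boundary into the grid.
    return tuple(min(int(c / res), n - 1) for c in p)

# The 8 vertices of a unit cube, i.e. all the points a cuboid mesh gives you.
corners = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]
occupied = {voxel(p) for p in corners}

# Number of boundary voxels a true surface voxelization would mark instead.
surface = n**3 - (n - 2)**3

print(len(occupied), "of", surface)  # 8 of 56
```

Only 8 of the 56 wall voxels become occupied, which is why the point-cloud route leaves the faces of the cuboid empty.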
1) First, download the latest version of binvox from http://www.patrickmin.com/binvox/
2) Convert your mesh file (e.g. Dude.ply) into a binvox file.
3) Grab binvox2bt.cpp from octomap on GitHub, compile it, and use it to convert the binvox file into a bt file.
4) To visualize the bt file, install octovis:
sudo apt-get install ros-kinetic-octovis
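Putting the four steps together, the whole pipeline looks roughly like this. This is a sketch, not a tested recipe: Dude.ply stands in for your own mesh, the voxel resolution is arbitrary, and the compile/link flags for binvox2bt are assumptions that may differ on your setup.

```shell
# 2) Voxelize the mesh; writes Dude.binvox next to the input
#    (-d sets the voxel grid dimension)
./binvox -d 256 Dude.ply

# 3) Compile binvox2bt (requires the octomap library) and convert
g++ binvox2bt.cpp -o binvox2bt -loctomap -loctomath
./binvox2bt Dude.binvox        # writes Dude.binvox.bt

# 4) View the result
octovis Dude.binvox.bt
```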
Mesh file (Dude.ply).
Octomap bt file (Dude.binvox.bt).
The sample file can be downloaded at
In this tutorial, I explain the concept, the probabilistic sensor fusion model, and the sensor model used in the OctoMap library.
OctoMap: An Efficient Probabilistic 3D Mapping Framework Based on Octrees
1) OctoMap Volumetric Model
An octree storing free (shaded white) and occupied (black) cells. Image is taken from Ref
2) Probabilistic Sensor Fusion Model
3) Sensor Model for Laser Range Data
Image is taken from Ref
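The heart of the probabilistic fusion model is an additive update of each voxel's log-odds occupancy, clamped between a lower and an upper bound so the map can still adapt to changes. A minimal Python sketch of that update; the hit/miss and clamping probabilities below are assumptions taken from OctoMap's commonly cited defaults, not values from this post:

```python
import math

def logodds(p):
    """Convert a probability to log-odds."""
    return math.log(p / (1.0 - p))

def prob(l):
    """Convert log-odds back to a probability."""
    return 1.0 - 1.0 / (1.0 + math.exp(l))

# Assumed sensor-model parameters (OctoMap-style defaults):
L_HIT, L_MISS = logodds(0.7), logodds(0.4)   # increments for hit / miss
L_MIN, L_MAX = logodds(0.12), logodds(0.97)  # clamping bounds

def update(l, hit):
    """Additive log-odds update with clamping."""
    l += L_HIT if hit else L_MISS
    return max(L_MIN, min(L_MAX, l))

# A voxel starting at the uniform prior (p = 0.5, log-odds 0)
# and observed as occupied three times in a row:
l = 0.0
for _ in range(3):
    l = update(l, hit=True)
print(round(prob(l), 3))  # 0.927
```

Because the update is a simple sum in log-odds space, fusing a new measurement is O(1) per voxel, and the clamping bounds keep long-occupied cells from becoming so certain that they can never be marked free again.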
This entry was posted in Machine Learning, ROS, Tutorials and tagged fusion, laser, machine learning, Octomap, octree, probability, ROS, sensor on November 27, 2017.
Here is an example of obtaining an occupancy grid from the sensory data of a turtlebot robot.
1. First you need to install all dependencies for gazebo, turtlebot, and the octomap server:
sudo apt-get install ros-indigo-octomap-server ros-indigo-turtlebot ros-indigo-turtlebot-teleop ros-indigo-turtlebot-description ros-indigo-turtlebot-navigation ros-indigo-turtlebot-rviz-launchers ros-indigo-turtlebot-simulator
2. Launch gazebo in a simulated environment:
roslaunch turtlebot_gazebo turtlebot_world.launch
3. Launch RVIZ to view the published data:
roslaunch turtlebot_rviz_launchers view_robot.launch
4. Move your robot via the keyboard:
roslaunch turtlebot_teleop keyboard_teleop.launch
To learn how to move the robot with your joystick controller, please follow my other tutorial on that here.
5. Create a file, save it as octomap_turtlebot.launch, and paste the following lines into it:
<launch>
  <node pkg="octomap_server" type="octomap_server_node" name="octomap_server">
    <param name="resolution" value="0.05" />
    <param name="frame_id" type="string" value="odom" />
    <!-- maximum range to integrate (speedup!) -->
    <param name="sensor_model/max_range" value="5.0" />
    <!-- data source to integrate (PointCloud2) -->
    <remap from="cloud_in" to="/camera/depth/points" />
  </node>
</launch>
and call it by:
roslaunch octomap_turtlebot.launch
6. Now in RVIZ, add an OccupancyMap display and view the octomap. Of course, you need to remove the floor first 🙂
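Once the server has built a map, you can also save it to disk and inspect it offline. A sketch, assuming the octomap_saver tool that ships with octomap_server; the filename is arbitrary:

```shell
# Save the map currently held by octomap_server to a bt file
rosrun octomap_server octomap_saver mapfile.bt

# Inspect the saved map offline
octovis mapfile.bt
```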