There are several approaches for estimating the probability density function of given data: 1) parametric, 2) semi-parametric, and 3) non-parametric. A parametric example is a Gaussian Mixture Model (GMM) fitted via an algorithm such as expectation maximization. Here is my other post on expectation maximization. An example of a non-parametric method is the histogram, where each data point is assigned to exactly one bin and, depending on the number of bins that fall within …
I found some really good code on GitHub for fitting a Gaussian Mixture Model (GMM) with Expectation Maximization (EM) in ROS. There are many parameters you can change; some of the most important ones are:
// minimum number of gaussians
#define PARAM_NAME_GAUSSIAN_COUNT_MIN "gaussian_count_min"
#define PARAM_DEFAULT_GAUSSIAN_COUNT_MIN 1
// search will terminate when the gaussian count reaches this
#define PARAM_NAME_GAUSSIAN_COUNT_MAX "gaussian_count_max"
#define PARAM_DEFAULT_GAUSSIAN_COUNT_MAX 10
To find the optimal number of components, it uses the Bayesian information criterion (BIC). There are other methods to find …