Analytic distance metric for Gaussian mixture models

In many applications, you need to compare two or more data sets to see how similar or different they are. For instance, suppose you have measured the heights of men and women in Japan and the Netherlands, and you would now like to know how different the two populations are.


Two commonly used methods for measuring the distance between distributions are the Kullback-Leibler divergence and the Bhattacharyya distance.
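For reference, the Bhattacharyya distance between two distributions p and q is defined as

\[ D_B(p, q) = -\ln \int \sqrt{p(x)\, q(x)} \, dx . \]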

The KL divergence (Kullback-Leibler divergence) measures the difference between two probability distributions p and q.
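For continuous distributions it is defined as

\[ D_{KL}(p \,\|\, q) = \int p(x) \log \frac{p(x)}{q(x)} \, dx . \]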

However, the KL divergence has a closed-form expression only when each data set is modeled by a single Gaussian; it cannot be computed analytically when your data is modeled by a mixture of Gaussians (GMM).
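For example, for two \( d \)-dimensional Gaussians \( \mathcal{N}(\mu_0, \Sigma_0) \) and \( \mathcal{N}(\mu_1, \Sigma_1) \) the divergence has the well-known closed form

\[ D_{KL} = \frac{1}{2} \left( \operatorname{tr}\!\left(\Sigma_1^{-1} \Sigma_0\right) + (\mu_1 - \mu_0)^T \Sigma_1^{-1} (\mu_1 - \mu_0) - d + \log \frac{|\Sigma_1|}{|\Sigma_0|} \right), \]

but no comparable closed form exists when p and q are mixtures of Gaussians, which is what the metric below addresses.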

Sfikas et al. [1] have extended the Kullback-Leibler divergence to GMMs and proposed a distance metric that is computed analytically from the mixture parameters \( (\pi_i, \mu_i, \Sigma_i) \) of each one of the two distributions, in the following form:

\[ C_2(p, p') = -\log \frac{2 \sum_{i,j} \pi_i \pi'_j \sqrt{\frac{|V_{ij}|}{e^{k_{ij}} |\Sigma_i| |\Sigma'_j|}}}{\sum_{i,j} \pi_i \pi_j \sqrt{\frac{|V_{ij}|}{e^{k_{ij}} |\Sigma_i| |\Sigma_j|}} + \sum_{i,j} \pi'_i \pi'_j \sqrt{\frac{|V_{ij}|}{e^{k_{ij}} |\Sigma'_i| |\Sigma'_j|}}} \]

Where:

\[ V_{ij} = \left( \Sigma_i^{-1} + \Sigma_j'^{-1} \right)^{-1} \]

and

\[ k_{ij} = (\mu_i - \mu'_j)^T \left( \Sigma_i + \Sigma'_j \right)^{-1} (\mu_i - \mu'_j), \]

with \( V_{ij} \) and \( k_{ij} \) evaluated for the pair of components appearing in each of the three sums (primed quantities belong to \( p' \), unprimed ones to \( p \)).

Code in MATLAB:
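Below is a minimal sketch of the formula above. The function name gmm_c2_distance and the argument layout (a vector of mixing weights, a K-by-d matrix of component means, and a d-by-d-by-K array of covariance matrices for each mixture) are conventions chosen for this sketch, not anything prescribed by the paper.

function d = gmm_c2_distance(w1, mu1, S1, w2, mu2, S2)
% C2 distance between two Gaussian mixture models (Sfikas et al. [1]).
% w:  K-by-1 vector of mixing weights
% mu: K-by-d matrix of component means (one row per component)
% S:  d-by-d-by-K array of covariance matrices
    pq = overlap(w1, mu1, S1, w2, mu2, S2);   % integral of p * p'
    pp = overlap(w1, mu1, S1, w1, mu1, S1);   % integral of p * p
    qq = overlap(w2, mu2, S2, w2, mu2, S2);   % integral of p' * p'
    d  = -log(2 * pq / (pp + qq));
end

function s = overlap(wa, mua, Sa, wb, mub, Sb)
% Sum over all component pairs of
%   wa_i * wb_j * sqrt(|V_ij| / (exp(k_ij) * |Sigma_i| * |Sigma_j|))
    s = 0;
    for i = 1:numel(wa)
        for j = 1:numel(wb)
            Si = Sa(:, :, i);
            Sj = Sb(:, :, j);
            V  = inv(inv(Si) + inv(Sj));         % V_ij
            dm = (mua(i, :) - mub(j, :))';       % mu_i - mu'_j
            k  = dm' * ((Si + Sj) \ dm);         % k_ij
            s  = s + wa(i) * wb(j) * ...
                 sqrt(det(V) / (exp(k) * det(Si) * det(Sj)));
        end
    end
end

As a quick sanity check, calling the function with the same mixture for both arguments returns 0: the cross term then equals both terms in the denominator, and \( -\log(1) = 0 \).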

 

Update: Here is a very nice interactive visualization of the Kullback-Leibler divergence.

Refs:

[1] G. Sfikas, C. Constantinopoulos, A. Likas, and N. P. Galatsanos, "An Analytic Distance Metric for Gaussian Mixture Models with Application in Image Retrieval," Proc. ICANN 2005.
