# Analytic distance metric for Gaussian mixture models

In many applications, you need to compare two or more data sets to see how similar or different they are. For instance, suppose you have measured the heights of men and women in Japan and the Netherlands, and you would like to know how much the distributions differ.

Two commonly used methods for measuring the distance between distributions are the Kullback-Leibler divergence and the Bhattacharyya distance.

The Kullback-Leibler (KL) divergence measures the difference between two probability distributions p and q:

$$D_{KL}(p\,\|\,q)=\int_{-\infty}^{\infty} p(x)\log\frac{p(x)}{q(x)}\,\mathrm{d}x$$

However, this has a simple closed form only when each data set is modeled by a single Gaussian; it is not analytically tractable when the data is modeled by a mixture of Gaussians.
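For the single-Gaussian case the closed form is well known. As a quick sketch (in Python/NumPy rather than MATLAB, and with function and variable names of my own choosing), here is the closed-form KL divergence between two univariate Gaussians, checked numerically against the integral definition above:

```python
import numpy as np

def kl_gauss(mu_p, sig_p, mu_q, sig_q):
    """Closed-form KL divergence between univariate Gaussians p and q."""
    return (np.log(sig_q / sig_p)
            + (sig_p**2 + (mu_p - mu_q)**2) / (2.0 * sig_q**2)
            - 0.5)

# Numerical check against the integral definition on a fine grid.
x = np.linspace(-20.0, 20.0, 200001)
p = np.exp(-(x - 1.0)**2 / 2.0) / np.sqrt(2.0 * np.pi)        # N(1, 1)
q = np.exp(-(x + 1.0)**2 / 8.0) / np.sqrt(2.0 * np.pi * 4.0)  # N(-1, 4)
numeric = np.sum(p * np.log(p / q)) * (x[1] - x[0])
```

Here `kl_gauss(1.0, 1.0, -1.0, 2.0)` and `numeric` agree to several decimal places, and the divergence of a Gaussian from itself is zero.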

Sfikas et al. [1] extended the Kullback-Leibler divergence to Gaussian mixture models (GMMs) and proposed a distance metric that uses the parameters ($\mu, \Sigma, \pi$) of each of the two distributions, in the following form:

$$C2(p\,\|\,q)=-\log\left[\frac{2\sum_{i,j}\pi_{i}\pi_{j}^{\prime}\sqrt{\frac{|V_{ij}|}{e^{k_{ij}}|\Sigma_{i}||\Sigma_{j}^{\prime}|}}}{\sum_{i,j}\pi_{i}\pi_{j}\sqrt{\frac{|V_{ij}|}{e^{k_{ij}}|\Sigma_{i}||\Sigma_{j}|}}+\sum_{i,j}\pi_{i}^{\prime}\pi_{j}^{\prime}\sqrt{\frac{|V_{ij}|}{e^{k_{ij}}|\Sigma_{i}^{\prime}||\Sigma_{j}^{\prime}|}}}\right]$$

Where:

$$V_{ij}=\left(\Sigma_{i}^{-1}+\Sigma_{j}^{\prime\,-1}\right)^{-1}$$

and

$$k_{ij}=\mu_{i}^{T}\Sigma_{i}^{-1}\left(\mu_{i}-\mu_{j}^{\prime}\right)+\mu_{j}^{\prime\,T}\Sigma_{j}^{\prime\,-1}\left(\mu_{j}^{\prime}-\mu_{i}\right)$$

Code in MATLAB:
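The MATLAB listing is not reproduced here; as a rough equivalent, below is a Python/NumPy sketch of the C2 distance. Each GMM is represented as a tuple `(weights, means, covariances)`; the function names and this representation are my own choices, not from [1]:

```python
import numpy as np

def _overlap(pis_a, mus_a, covs_a, pis_b, mus_b, covs_b):
    """Sum over component pairs of pi_i * pi_j * sqrt(|V_ij| / (e^{k_ij} |S_i| |S_j|))."""
    total = 0.0
    for pi_i, mu_i, S_i in zip(pis_a, mus_a, covs_a):
        Si_inv = np.linalg.inv(S_i)
        for pi_j, mu_j, S_j in zip(pis_b, mus_b, covs_b):
            Sj_inv = np.linalg.inv(S_j)
            # V_ij = (S_i^{-1} + S_j^{-1})^{-1}
            V = np.linalg.inv(Si_inv + Sj_inv)
            # k_ij = mu_i^T S_i^{-1} (mu_i - mu_j) + mu_j^T S_j^{-1} (mu_j - mu_i)
            k = mu_i @ Si_inv @ (mu_i - mu_j) + mu_j @ Sj_inv @ (mu_j - mu_i)
            total += pi_i * pi_j * np.sqrt(
                np.linalg.det(V)
                / (np.exp(k) * np.linalg.det(S_i) * np.linalg.det(S_j))
            )
    return total

def c2_distance(gmm_p, gmm_q):
    """C2 distance between two GMMs, each given as (weights, means, covariances)."""
    s_pq = _overlap(*gmm_p, *gmm_q)
    s_pp = _overlap(*gmm_p, *gmm_p)
    s_qq = _overlap(*gmm_q, *gmm_q)
    return -np.log(2.0 * s_pq / (s_pp + s_qq))
```

By construction, the distance of a GMM from itself is zero, and by the Cauchy-Schwarz inequality the cross term never exceeds the average of the self terms, so the result is non-negative.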

Update: Here is a very nice interactive visualization of the Kullback-Leibler divergence.

Refs: [1]