In many applications, you need to compare two or more data sets to see how similar or different they are. For instance, you have measured the heights of men and women in Japan and the Netherlands, and now you would like to know how different the two populations are.
Two commonly used methods for measuring the distance between probability distributions are the Kullback-Leibler divergence and the Bhattacharyya distance.
The KL divergence (Kullback-Leibler divergence) measures the difference between two probability distributions \(p\) and \(q\).
However, it has a simple closed form only when each distribution is a single Gaussian; there is no closed-form expression when your data is modeled as a mixture of Gaussians (GMM).
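For two multivariate Gaussians \(\mathcal{N}(\mu_0,\Sigma_0)\) and \(\mathcal{N}(\mu_1,\Sigma_1)\) the closed form is \(D_{KL} = \frac{1}{2}\left(\mathrm{tr}(\Sigma_1^{-1}\Sigma_0) + (\mu_1-\mu_0)^T\Sigma_1^{-1}(\mu_1-\mu_0) - d + \log\frac{|\Sigma_1|}{|\Sigma_0|}\right)\). As a small illustrative sketch (not part of the original post; the function name GaussianKL is just made up here), it can be computed in MATLAB like this:

function kl = GaussianKL(mu0, sigma0, mu1, sigma1)
% Closed-form KL divergence D_KL( N(mu0, sigma0) || N(mu1, sigma1) ).
% Illustrative sketch, save as GaussianKL.m.
% mu0, mu1: D-by-1 mean vectors; sigma0, sigma1: D-by-D covariance matrices.
    d    = numel(mu0);
    diff = mu1 - mu0;
    % trace term + Mahalanobis term - dimension + log-determinant ratio
    kl   = 0.5 * ( trace(sigma1 \ sigma0) + diff' * (sigma1 \ diff) ...
                   - d + log(det(sigma1) / det(sigma0)) );
end

For example, GaussianKL([0;0], eye(2), [1;0], 2*eye(2)) should return about 0.44 nats.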
Sfikas et al. [1] have extended the Kullback-Leibler divergence to GMMs and proposed a distance metric that uses the parameters \((\pi_j, \mu_j, \Sigma_j)\) of each of the two mixtures, in the following form:
\[
d(p, p') = -\log\frac{2\sum_{i,j}\pi_i\,\pi'_j\,\sqrt{\frac{|V_{ij}|}{e^{k_{ij}}\,|\Sigma_i|\,|\Sigma'_j|}}}{\sum_{i,j}\pi_i\,\pi_j\,\sqrt{\frac{|V_{ij}|}{e^{k_{ij}}\,|\Sigma_i|\,|\Sigma_j|}} + \sum_{i,j}\pi'_i\,\pi'_j\,\sqrt{\frac{|V'_{ij}|}{e^{k'_{ij}}\,|\Sigma'_i|\,|\Sigma'_j|}}}
\]
Where:
\[
V_{ij} = \left(\Sigma_i^{-1} + \Sigma_j'^{-1}\right)^{-1}
\]
and
\[
k_{ij} = \mu_i^{T}\,\Sigma_i^{-1}\,(\mu_i - \mu'_j) + \mu_j'^{T}\,\Sigma_j'^{-1}\,(\mu'_j - \mu_i).
\]
Here \((\pi_i, \mu_i, \Sigma_i)\) are the weights, means and covariances of the first mixture, the primed quantities belong to the second mixture, and in each sum \(V\) and \(k\) are computed from the pair of components being combined (these are the V11, V12, V22 and K11, K12, K22 terms in the code below).
Code in MATLAB:
function distance = GetDistanceBetweenTwoDists(means1, pis1, covs1, means2, pis2, covs2)
% Distance between two Gaussian mixture models (Sfikas et al. [1]).
% means* : K-by-D matrices of component means
% pis*   : K-by-1 vectors of mixing weights
% covs*  : K-by-D-by-D arrays of component covariance matrices

% V11 and K11 (components of the first GMM against itself)
for i = 1:size(means1,1)
    for j = 1:size(means1,1)
        cov1 = covs1(i,:,:);
        cov2 = covs1(j,:,:);
        cov1 = reshape(cov1, size(covs1,2), size(covs1,2));
        cov2 = reshape(cov2, size(covs1,2), size(covs1,2));
        V11(i,j) = det(inv(inv(cov1) + inv(cov2)));
        K11(i,j) = means1(i,:)*inv(cov1)*(means1(i,:) - means1(j,:))' + means1(j,:)*inv(cov2)*(means1(j,:) - means1(i,:))';
    end
end

% V12 and K12 (first GMM against the second)
for i = 1:size(means1,1)
    for j = 1:size(means2,1)
        cov1 = covs1(i,:,:);
        cov2 = covs2(j,:,:);
        cov1 = reshape(cov1, size(covs1,2), size(covs1,2));
        cov2 = reshape(cov2, size(covs1,2), size(covs1,2));
        V12(i,j) = det(inv(inv(cov1) + inv(cov2)));
        K12(i,j) = means1(i,:)*inv(cov1)*(means1(i,:) - means2(j,:))' + means2(j,:)*inv(cov2)*(means2(j,:) - means1(i,:))';
    end
end

% V22 and K22 (second GMM against itself)
for i = 1:size(means2,1)
    for j = 1:size(means2,1)
        cov1 = covs2(i,:,:);
        cov2 = covs2(j,:,:);
        cov1 = reshape(cov1, size(covs1,2), size(covs1,2));
        cov2 = reshape(cov2, size(covs1,2), size(covs1,2));
        V22(i,j) = det(inv(inv(cov1) + inv(cov2)));
        K22(i,j) = means2(i,:)*inv(cov1)*(means2(i,:) - means2(j,:))' + means2(j,:)*inv(cov2)*(means2(j,:) - means2(i,:))';
    end
end

% Sum11: proportional to the integral of p1(x)^2 (the common constant cancels in the ratio)
Sum11 = 0;
for i = 1:size(means1,1)
    for j = 1:size(means1,1)
        cov1 = covs1(i,:,:);
        cov2 = covs1(j,:,:);
        cov1 = reshape(cov1, size(covs1,2), size(covs1,2));
        cov2 = reshape(cov2, size(covs1,2), size(covs1,2));
        Sum11 = Sum11 + pis1(i)*pis1(j)*sqrt(V11(i,j)/(exp(K11(i,j))*det(cov1)*det(cov2)));
    end
end

% Sum12: proportional to the integral of p1(x)*p2(x)
Sum12 = 0;
for i = 1:size(means1,1)
    for j = 1:size(means2,1)
        cov1 = covs1(i,:,:);
        cov2 = covs2(j,:,:);
        cov1 = reshape(cov1, size(covs1,2), size(covs1,2));
        cov2 = reshape(cov2, size(covs1,2), size(covs1,2));
        Sum12 = Sum12 + pis1(i)*pis2(j)*sqrt(V12(i,j)/(exp(K12(i,j))*det(cov1)*det(cov2)));
    end
end

% Sum22: proportional to the integral of p2(x)^2
Sum22 = 0;
for i = 1:size(means2,1)
    for j = 1:size(means2,1)
        cov1 = covs2(i,:,:);
        cov2 = covs2(j,:,:);
        cov1 = reshape(cov1, size(covs1,2), size(covs1,2));
        cov2 = reshape(cov2, size(covs1,2), size(covs1,2));
        Sum22 = Sum22 + pis2(i)*pis2(j)*sqrt(V22(i,j)/(exp(K22(i,j))*det(cov1)*det(cov2)));
    end
end

distance = -log(2*Sum12/(Sum11 + Sum22));
%distance = 2*Sum12/(Sum11+Sum22);   % alternative: return the similarity instead of the distance
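For reference, a call might look like the sketch below (the example values are made up; the function expects K-by-D means, K-by-1 weights and K-by-D-by-D covariances, as used by its indexing):

% Illustrative call: two 2-component GMMs in 2-D
means1 = [0 0; 3 3];          % component means of the first GMM (K-by-D)
pis1   = [0.5; 0.5];          % mixing weights (K-by-1)
covs1  = zeros(2, 2, 2);      % covariances (K-by-D-by-D)
covs1(1,:,:) = eye(2);
covs1(2,:,:) = eye(2);

means2 = [0.5 0; 3 2.5];      % a slightly shifted second GMM
pis2   = [0.4; 0.6];
covs2  = zeros(2, 2, 2);
covs2(1,:,:) = eye(2);
covs2(2,:,:) = 2*eye(2);

d = GetDistanceBetweenTwoDists(means1, pis1, covs1, means2, pis2, covs2)

The distance is 0 for identical mixtures and grows as the two mixtures become more different.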
Update: Here is a very nice interactive visualization of the Kullback-Leibler divergence.
Refs: [1]