Hierarchical Clustering Based on Mutual Information

A. Kraskov, H. Stögbauer, R. G. Andrzejak, and P. Grassberger. Hierarchical clustering based on mutual information. Bioinformatics, pages 1–11, 2003. [url]

————————–

This paper presents a new method for hierarchical clustering of data called mutual information clustering (MIC). It uses the mutual information (MI) as a similarity measure.

The core idea is that the objects to be clustered can be either single finite patterns or random variables (i.e., probability distributions). In the latter case, similarity is conventionally measured with the Pearson correlation coefficient.
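As a minimal illustration (not taken from the paper), the correlation-based approach amounts to computing a pairwise matrix of Pearson coefficients over sampled random variables, e.g. with NumPy:

```python
# Illustrative sketch: Pearson correlation as a pairwise similarity measure
# between sampled random variables (the conventional approach the paper contrasts with).
import numpy as np

rng = np.random.default_rng(0)
n_samples = 1000
x = rng.normal(size=n_samples)
y = 0.8 * x + 0.2 * rng.normal(size=n_samples)   # linearly related to x
z = rng.normal(size=n_samples)                   # independent of x and y

data = np.vstack([x, y, z])
similarity = np.corrcoef(data)   # 3x3 matrix of Pearson correlation coefficients
print(similarity.round(2))
```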

According to the authors, the correlation coefficient is insensitive to nonlinear dependencies that do not manifest themselves in the covariance, and can therefore miss important features. In contrast, they propose MI as a better indicator of similarity, since MI vanishes only when the two random variables are strictly independent.
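The following sketch illustrates this contrast and a generic MI-based hierarchical clustering; it is not the authors' MIC algorithm. The `mutual_information` helper, the histogram (plug-in) MI estimate, the ad hoc distance `1 - MI/max(MI)`, and the use of SciPy's average linkage are all illustrative assumptions, not choices made in the paper.

```python
# Hedged sketch: plug-in (histogram) MI estimate plus hierarchical clustering
# on an MI-derived distance. For x standard normal and y = x**2, the Pearson
# correlation is near zero while the estimated MI is clearly positive.
import numpy as np
from scipy.cluster.hierarchy import linkage, dendrogram

def mutual_information(x, y, bins=16):
    """Plug-in MI estimate (in nats) from a 2D histogram of the samples."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nonzero = pxy > 0
    return np.sum(pxy[nonzero] * np.log(pxy[nonzero] / (px @ py)[nonzero]))

rng = np.random.default_rng(0)
x = rng.normal(size=2000)
variables = {
    "x": x,
    "x_squared": x**2,                      # nonlinear function of x
    "noise": rng.normal(size=2000),         # independent of everything else
}
names = list(variables)

print("corr(x, x^2) =", np.corrcoef(x, variables["x_squared"])[0, 1].round(3))
print("MI(x, x^2)   =", mutual_information(x, variables["x_squared"]).round(3))

# Pairwise MI matrix and a crude MI-based distance fed to agglomerative clustering.
k = len(names)
mi = np.zeros((k, k))
for i in range(k):
    for j in range(k):
        mi[i, j] = mutual_information(variables[names[i]], variables[names[j]])
dist = 1.0 - mi / mi.max()                  # illustrative distance, not the paper's
condensed = dist[np.triu_indices(k, k=1)]   # condensed form expected by linkage
tree = linkage(condensed, method="average")
print(dendrogram(tree, no_plot=True, labels=names)["ivl"])
```

The histogram estimator is only a stand-in; the authors use their own MI estimators, and MIC itself merges clusters by treating grouped variables jointly rather than by thresholding a fixed pairwise distance matrix.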
