Calculation of mutual information in R

lemhop · Sep 11, 2014 · Viewed 12.1k times

I am having trouble interpreting the results of the mi.plugin() (or mi.empirical()) function from the entropy package. As far as I understand, an MI of 0 tells you that the two variables you are comparing are completely independent, and as MI increases, the association between the two variables becomes increasingly non-random.

Why, then, do I get a value of 0 when running the following in R (using the {entropy} package):

mi.plugin(rbind(c(1, 2, 3), c(1, 2, 3)))

when I'm comparing two vectors that are exactly the same?

I assume my confusion stems from a theoretical misunderstanding on my part. Can someone tell me where I've gone wrong?

Thanks in advance.

Answer

Monicam · Jul 15, 2016

Use mutinformation(x, y) from the infotheo package:

> mutinformation(c(1, 2, 3), c(1, 2, 3))
[1] 1.098612

> mutinformation(1:5, 1:5)
[1] 1.609438

Both calls compare a vector with itself, so the MI equals the entropy of that vector: log(3) ≈ 1.0986 nats for three distinct values and log(5) ≈ 1.6094 nats for five. The normalized mutual information will therefore be 1 in both cases.
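
You can verify that normalization directly. A minimal sketch, assuming infotheo's entropy() for the marginal entropies and the common normalization MI / sqrt(H(X) * H(Y)):

> library(infotheo)
> x <- c(1, 2, 3)
> # MI(X, X) = H(X), so the normalized MI of a vector with itself is 1
> mutinformation(x, x) / sqrt(entropy(x) * entropy(x))
[1] 1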
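
As for why mi.plugin() returned 0 in the question: mi.plugin() does not take two sample vectors. Per the entropy package documentation it takes a single matrix of joint bin frequencies, so rbind(c(1, 2, 3), c(1, 2, 3)) is read as a 2x3 joint frequency table. Its rows are proportional, which describes two perfectly independent variables, and an MI of 0 is the correct result for that table. To apply mi.plugin() to two samples, first cross-tabulate them, e.g. with table(); a minimal sketch:

> library(entropy)
> x <- c(1, 2, 3)
> # table(x, x) builds the joint counts: here a 3x3 diagonal table
> mi.plugin(table(x, x))
[1] 1.098612

which agrees with infotheo's mutinformation() above (both use natural logarithms by default).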