[R] entropy package: how to compute mutual information?
Sam Steingold
sds at gnu.org
Mon Feb 13 22:14:36 CET 2012
Suppose I have two factor vectors:
x <- as.factor(c("a","b","a","c","b","c"))
y <- as.factor(c("b","a","a","c","c","b"))
Using library(entropy), I can compute their individual entropies:
entropy(table(x))
[1] 1.098612
but it is not clear how to compute their mutual information directly.
I can compute the joint entropy as
entropy(table(paste(x,y,sep="")))
[1] 1.791759
and then the mutual information is h(x) + h(y) - h(x,y) =
1.098612 + 1.098612 - 1.791759 = 0.405465
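Just to sanity-check the numbers, here is the same arithmetic reproduced in base R (the H() helper below is my own throwaway plug-in estimator, not anything from the entropy package):

```r
x <- as.factor(c("a","b","a","c","b","c"))
y <- as.factor(c("b","a","a","c","c","b"))

# Plug-in (empirical) entropy of a count table, in nats.
H <- function(counts) {
  p <- counts / sum(counts)
  p <- p[p > 0]           # drop empty cells so log(0) never occurs
  -sum(p * log(p))
}

# h(x) + h(y) - h(x,y), with the joint entropy taken over the pasted pairs
# exactly as above.
mi <- H(table(x)) + H(table(y)) - H(table(paste(x, y, sep = "")))
round(mi, 6)              # 0.405465
```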
but I was wondering whether there is a better way (one that does not
build a throwaway character vector with paste() just to tabulate it and
immediately discard it).
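For what it's worth, table(x, y) already builds the joint contingency table, so the paste() step can be skipped; and if I am reading the package index correctly, entropy also exports an mi.plugin() that takes a 2-D count table and does the subtraction itself (I have not double-checked its exact signature, so treat that call as an assumption):

```r
library(entropy)  # Hausser & Strimmer's entropy package

x <- as.factor(c("a","b","a","c","b","c"))
y <- as.factor(c("b","a","a","c","c","b"))

# table(x, y) is the joint contingency table; entropy() flattens it,
# so no pasted vector is ever created.
entropy(table(x)) + entropy(table(y)) - entropy(table(x, y))

# Assumed shortcut: mi.plugin() on the same 2-D table should give the
# same 0.405465 as the by-hand subtraction above.
mi.plugin(table(x, y))
```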
--
Sam Steingold (http://sds.podval.org/) on Ubuntu 11.10 (oneiric) X 11.0.11004000
http://www.childpsy.net/ http://iris.org.il http://ffii.org http://camera.org
http://americancensorship.org http://dhimmi.com http://pmw.org.il
There is Truth, and its value is T. Or just non-NIL. So 0 is True!