mutual information

Definition from Wiktionary, the free dictionary


Wikipedia has an article on: mutual information

mutual information (usually uncountable, plural mutual informations)

  1. (information theory) A measure of the entropic (informational) correlation between two random variables.
    Mutual information I(X;Y) between two random variables X and Y is what is left of their joint entropy H(X,Y) after the conditional entropies H(Y|X) and H(X|Y) are subtracted. It can be given by the formula I(X;Y) = - \sum_x \sum_y p_{X,Y} (x,y) \log_b {p_{X,Y} (x,y) \over p_{X|Y} (x|y) \, p_{Y|X} (y|x)}, which is equivalent to the more common form I(X;Y) = \sum_x \sum_y p_{X,Y} (x,y) \log_b {p_{X,Y} (x,y) \over p_X (x) \, p_Y (y)}.
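Since the definition is mathematical, a small worked example may help. The following Python sketch (an illustration, not part of the entry; the joint distribution is invented) computes I(X;Y) directly from a joint distribution and checks it against the equivalent entropy identity I(X;Y) = H(X) + H(Y) - H(X,Y):

```python
import math

# Hypothetical joint distribution p(x, y) of two binary random variables,
# chosen purely for illustration.
p_xy = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.4,
}

# Marginal distributions p_X(x) and p_Y(y), obtained by summing out
# the other variable.
p_x, p_y = {}, {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

def mutual_information(p_xy, p_x, p_y):
    # I(X;Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p_X(x) * p_Y(y)) ),
    # in bits (base-2 logarithm); zero-probability terms contribute nothing.
    return sum(p * math.log2(p / (p_x[x] * p_y[y]))
               for (x, y), p in p_xy.items() if p > 0)

def entropy(dist):
    # Shannon entropy H = -sum_i p_i * log2(p_i), in bits.
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

i_direct = mutual_information(p_xy, p_x, p_y)
i_entropy = entropy(p_x) + entropy(p_y) - entropy(p_xy)
print(i_direct, i_entropy)  # the two values agree
```

For this distribution both routes give the same positive value, reflecting that X and Y are correlated (they agree with probability 0.8).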

See also