# mutual information

## English


### Noun

mutual information (usually uncountable, plural mutual informations)

1. (information theory) A measure of the entropic (informational) correlation between two random variables.
Mutual information $I(X;Y)$ between two random variables $X$ and $Y$ is what remains when their conditional entropies $H(Y|X)$ and $H(X|Y)$ are subtracted from their joint entropy $H(X,Y)$. It can be given by the formula $I(X;Y)=-\sum_{x}\sum_{y}p_{X,Y}(x,y)\log_{b}{\dfrac{p_{X,Y}(x,y)}{p_{X|Y}(x|y)\,p_{Y|X}(y|x)}}$.
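As a sketch, the formula above can be checked numerically against the more common definition $I(X;Y)=\sum_{x}\sum_{y}p_{X,Y}(x,y)\log_{b}\frac{p_{X,Y}(x,y)}{p_{X}(x)\,p_{Y}(y)}$; the joint distribution below is an illustrative assumption, not from the entry:

```python
import math

# Hypothetical joint distribution p_XY(x, y) over two binary variables
# (an assumption for illustration only).
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginals p_X(x) and p_Y(y) obtained by summing out the other variable.
px, py = {}, {}
for (x, y), v in p.items():
    px[x] = px.get(x, 0.0) + v
    py[y] = py.get(y, 0.0) + v

# Common definition: I(X;Y) = sum_xy p(x,y) * log[ p(x,y) / (p(x) p(y)) ]
I_marginals = sum(
    v * math.log2(v / (px[x] * py[y])) for (x, y), v in p.items()
)

# Entry's formula: I(X;Y) = -sum_xy p(x,y) * log[ p(x,y) / (p(x|y) p(y|x)) ]
# with p(x|y) = p(x,y)/p(y) and p(y|x) = p(x,y)/p(x).
I_conditionals = -sum(
    v * math.log2(v / ((v / py[y]) * (v / px[x])))
    for (x, y), v in p.items()
)

print(I_marginals, I_conditionals)  # the two values should agree
```

Both expressions reduce to $H(X) - H(X|Y)$, so they yield the same value (in bits here, since the logarithm base $b$ is 2).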