# Mutual information

Mutual information ${\displaystyle I(X;Y)}$ between two random variables ${\displaystyle X}$ and ${\displaystyle Y}$ is what remains of their joint entropy ${\displaystyle H(X,Y)}$ after both conditional entropies ${\displaystyle H(Y|X)}$ and ${\displaystyle H(X|Y)}$ are subtracted from it: ${\displaystyle I(X;Y)=H(X,Y)-H(X|Y)-H(Y|X)}$. It can be given by the formula ${\displaystyle I(X;Y)=-\sum _{x}\sum _{y}p_{X,Y}(x,y)\log _{b}{p_{X,Y}(x,y) \over p_{X|Y}(x|y)p_{Y|X}(y|x)}}$. Since ${\displaystyle p_{X|Y}(x|y)\,p_{Y|X}(y|x)=p_{X,Y}(x,y)^{2}/\bigl(p_{X}(x)\,p_{Y}(y)\bigr)}$, this is equivalent to the more common form ${\displaystyle I(X;Y)=\sum _{x}\sum _{y}p_{X,Y}(x,y)\log _{b}{p_{X,Y}(x,y) \over p_{X}(x)\,p_{Y}(y)}}$.
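As a concrete check of these formulas, here is a minimal sketch in Python that computes ${\displaystyle I(X;Y)}$ for a small, made-up joint distribution (the probabilities below are illustrative only), once from the summation formula and once from the entropy identity:

```python
import math

# Hypothetical joint distribution p(x, y) over two binary variables,
# chosen purely for illustration.
p_xy = {
    (0, 0): 0.4,
    (0, 1): 0.1,
    (1, 0): 0.1,
    (1, 1): 0.4,
}

# Marginal distributions p(x) and p(y).
p_x, p_y = {}, {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

# Summation form (base-2 logs, so the result is in bits):
# I(X;Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) * p(y)) )
mi_sum = sum(
    p * math.log2(p / (p_x[x] * p_y[y]))
    for (x, y), p in p_xy.items()
    if p > 0
)

# Entropy form: I(X;Y) = H(X,Y) - H(X|Y) - H(Y|X),
# using H(X|Y) = H(X,Y) - H(Y) and H(Y|X) = H(X,Y) - H(X).
H_xy = -sum(p * math.log2(p) for p in p_xy.values() if p > 0)
H_x = -sum(p * math.log2(p) for p in p_x.values() if p > 0)
H_y = -sum(p * math.log2(p) for p in p_y.values() if p > 0)
mi_entropy = H_xy - (H_xy - H_y) - (H_xy - H_x)

print(mi_sum, mi_entropy)
```

Both routes agree to floating-point precision, and the result is positive here because the chosen distribution concentrates mass on the diagonal, making ${\displaystyle X}$ and ${\displaystyle Y}$ dependent.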