# Conditional entropy

The conditional entropy of a random variable $Y$ given $X$ (i.e., conditioned on $X$), denoted $H(Y\mid X)$, is the uncertainty remaining in $Y$ once $X$ is known. It satisfies $H(Y\mid X) = H(Y) - I(Y;X)$, where $I(Y;X)$ is the mutual information between $Y$ and $X$.
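As a sketch of this identity, the snippet below computes $H(Y\mid X)$ for a small hypothetical joint distribution (the matrix `p_xy` is an arbitrary example, not from the text) in two ways: directly via the chain rule $H(Y\mid X) = H(X,Y) - H(X)$, and via the identity above using $I(Y;X) = H(X) + H(Y) - H(X,Y)$.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (zero terms dropped)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical joint distribution p(x, y): rows index X, columns index Y.
p_xy = np.array([[0.25, 0.25],
                 [0.40, 0.10]])

p_x = p_xy.sum(axis=1)   # marginal distribution of X
p_y = p_xy.sum(axis=0)   # marginal distribution of Y

# Chain rule: H(Y|X) = H(X, Y) - H(X)
h_joint = entropy(p_xy.ravel())
h_cond = h_joint - entropy(p_x)

# Identity from the text: H(Y|X) = H(Y) - I(Y;X),
# with I(Y;X) = H(X) + H(Y) - H(X, Y).
mi = entropy(p_x) + entropy(p_y) - h_joint
assert np.isclose(h_cond, entropy(p_y) - mi)
```

Both routes agree for any joint distribution; the chain-rule form is usually the one computed in practice, since it needs only the joint and one marginal.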