# joint entropy

Definition from Wiktionary, the free dictionary

## English

### Noun

**joint entropy** (*plural* **joint entropies**)

- (information theory) The Shannon entropy of a "script" whose "characters" are elements of the Cartesian product of the sets of characters of the component scripts.
  *If random variables X and Y are mutually independent, then their* **joint entropy** *H(X, Y) is just the sum of their component entropies: H(X, Y) = H(X) + H(Y). If they are not mutually independent, then their* **joint entropy** *will be H(X) + H(Y) − I(X; Y), where I(X; Y) is the mutual information of X and Y.*
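A short sketch of the definition, assuming finite samples given as (x, y) pairs: the joint entropy is just the Shannon entropy computed over the product alphabet, and the independent case makes it equal the sum of the marginals. The function names here are illustrative, not part of any standard library.

```python
from collections import Counter
from math import log2

def joint_entropy(pairs):
    """Shannon entropy of the empirical joint distribution of (x, y) pairs."""
    counts = Counter(pairs)
    n = len(pairs)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def entropy(xs):
    """Marginal Shannon entropy, computed as a 1-tuple 'joint' entropy."""
    return joint_entropy([(x,) for x in xs])

# Independent case: X and Y uniform on {0, 1}, all four pairs equally likely.
pairs = [(0, 0), (0, 1), (1, 0), (1, 1)]
X = [p[0] for p in pairs]
Y = [p[1] for p in pairs]
print(joint_entropy(pairs))         # 2.0
print(entropy(X) + entropy(Y))      # 2.0  (independent: H(X,Y) = H(X) + H(Y))

# Fully dependent case: Y = X, so I(X; Y) = H(X) = 1 bit,
# and H(X,Y) = H(X) + H(Y) - I(X;Y) = 1 + 1 - 1 = 1.
dep = [(0, 0), (1, 1)]
print(joint_entropy(dep))           # 1.0
```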