joint entropy

Definition from Wiktionary, the free dictionary

English

Noun

joint entropy (plural joint entropies)

  1. (information theory) The Shannon entropy of a "script" whose "characters" are elements of the Cartesian product of the sets of characters of the component scripts.
    If random variables X and Y are mutually independent, then their joint entropy H(X,Y) is just the sum H(X) + H(Y) of their component entropies. If they are not mutually independent, then their joint entropy is H(X) + H(Y) - I(X;Y), where I(X;Y) is the mutual information of X and Y.
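The identity above can be checked numerically. The following is a minimal Python sketch (not part of the dictionary entry itself); the joint probability table is a hypothetical example, chosen only to illustrate that H(X,Y) = H(X) + H(Y) - I(X;Y).

import math

# Hypothetical joint distribution p(x, y); any table whose entries sum to 1 works.
p_xy = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.4,
}

def entropy(probs):
    """Shannon entropy (in bits) of an iterable of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Marginal distributions p(x) and p(y).
p_x = {x: sum(p for (xi, _), p in p_xy.items() if xi == x) for x in (0, 1)}
p_y = {y: sum(p for (_, yi), p in p_xy.items() if yi == y) for y in (0, 1)}

h_xy = entropy(p_xy.values())  # joint entropy H(X,Y)
h_x = entropy(p_x.values())    # H(X)
h_y = entropy(p_y.values())    # H(Y)

# Mutual information from its definition: I(X;Y) = sum over (x,y) of p(x,y) * log2(p(x,y) / (p(x) * p(y))).
i_xy = sum(p * math.log2(p / (p_x[x] * p_y[y]))
           for (x, y), p in p_xy.items() if p > 0)

print(f"H(X,Y)               = {h_xy:.4f} bits")
print(f"H(X) + H(Y) - I(X;Y) = {h_x + h_y - i_xy:.4f} bits")  # matches H(X,Y)

For an independent pair (e.g. p(x,y) = p(x)p(y) for all entries), i_xy comes out to 0 and the joint entropy reduces to H(X) + H(Y).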

Related terms