conditional entropy

Definition from Wiktionary, the free dictionary

English


Noun

conditional entropy (plural conditional entropies)

  1. (information theory) The portion of a random variable's own Shannon entropy that is independent of another, given, random variable.
    The conditional entropy of a random variable Y given X (i.e., conditioned by X), denoted as H(Y|X), is equal to H(Y) − I(Y;X), where I(Y;X) is the mutual information between Y and X.
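The definition above can be checked numerically. The following is a minimal Python sketch (function names are illustrative, not from any source) that computes H(Y|X) from a finite joint distribution via the standard identity H(Y|X) = H(X,Y) − H(X), which is equivalent to H(Y) − I(Y;X):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def conditional_entropy(joint):
    """H(Y|X) for a joint distribution given as {(x, y): probability}.

    Uses the chain rule H(Y|X) = H(X,Y) - H(X).
    """
    # Marginal distribution of X, summing the joint over y.
    px = {}
    for (x, _y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
    return entropy(joint.values()) - entropy(px.values())

# Example: X and Y independent fair bits, so knowing X tells us
# nothing about Y and H(Y|X) = H(Y) = 1 bit.
uniform = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
print(conditional_entropy(uniform))  # 1.0
```

For a correlated distribution such as {(0,0): 0.5, (1,0): 0.25, (1,1): 0.25}, the same function gives H(Y|X) = 0.5 bit: observing X removes part of Y's uncertainty, exactly the mutual information I(Y;X).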

Related terms