Shannon entropy




Named after Claude Shannon, the "father of information theory".


Shannon entropy (countable and uncountable, plural Shannon entropies)

  1. information entropy
    Shannon entropy H is given by the formula H = −∑ᵢ pᵢ log₂ pᵢ, where pᵢ is the probability of character number i appearing in the stream of characters of the message.
         Consider a simple digital circuit with a two-bit input (X, Y) and a two-bit output (X AND Y, X OR Y). Assuming that the two input bits X and Y have mutually independent 50% chances of being HIGH, the input combinations (0,0), (0,1), (1,0), and (1,1) each occur with probability 1/4, so the circuit's Shannon entropy on the input side is H = 4 × (1/4 × log₂ 4) = 2 bits. The possible output combinations are (0,0), (0,1), and (1,1), occurring with probabilities 1/4, 1/2, and 1/4 respectively, so the circuit's Shannon entropy on the output side is H = 2 × (1/4 × log₂ 4) + 1/2 × log₂ 2 = 1.5 bits. The circuit therefore reduces (or "orders") the information passing through it by half a bit of Shannon entropy, owing to its logical irreversibility, as the short sketch below verifies.
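The arithmetic in the circuit example can be checked directly from the definition. The following is a minimal Python sketch; the helper name shannon_entropy is ours for illustration and not part of any standard library.

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p_i * log2(p_i)), in bits.

    Zero-probability outcomes contribute nothing, by the usual
    convention that 0 * log2(0) = 0.
    """
    return -sum(p * log2(p) for p in probs if p > 0)

# Input side: (X, Y) uniform over its four combinations.
h_in = shannon_entropy([1/4, 1/4, 1/4, 1/4])

# Output side: (X AND Y, X OR Y) takes (0,0), (0,1), (1,1)
# with probabilities 1/4, 1/2, 1/4.
h_out = shannon_entropy([1/4, 1/2, 1/4])

print(h_in)          # 2.0 bits
print(h_out)         # 1.5 bits
print(h_in - h_out)  # 0.5 bits lost to logical irreversibility
```

Running the sketch prints 2.0, 1.5, and 0.5, matching the half-bit reduction described above.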
