mutual information
English
Noun
mutual information (usually uncountable, plural mutual informations)
- (information theory) A measure of the entropic (informational) correlation between two random variables.
- Mutual information I(X;Y) between two random variables X and Y is what is left over when their conditional entropies H(Y|X) and H(X|Y) are subtracted from their joint entropy H(X,Y). It can be given by the formula I(X;Y) = H(X,Y) − H(X|Y) − H(Y|X).
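The formula above can be checked numerically. The sketch below computes mutual information from a small, hypothetical joint distribution of two binary variables, deriving the conditional entropies from the chain rule H(X|Y) = H(X,Y) − H(Y); the distribution and variable names are illustrative, not from the entry.

```python
import math

def entropy(probs):
    # Shannon entropy in bits, skipping zero-probability outcomes
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y) over two binary variables
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions p(x) and p(y)
px = {x: sum(p for (a, _), p in joint.items() if a == x) for x in (0, 1)}
py = {y: sum(p for (_, b), p in joint.items() if b == y) for y in (0, 1)}

h_xy = entropy(joint.values())   # joint entropy H(X,Y)
h_x = entropy(px.values())       # marginal entropy H(X)
h_y = entropy(py.values())       # marginal entropy H(Y)

# Conditional entropies via the chain rule
h_x_given_y = h_xy - h_y         # H(X|Y)
h_y_given_x = h_xy - h_x         # H(Y|X)

# Mutual information as defined above
mi = h_xy - h_x_given_y - h_y_given_x

# Sanity check against the equivalent identity I(X;Y) = H(X) + H(Y) - H(X,Y)
assert abs(mi - (h_x + h_y - h_xy)) < 1e-12
```

For this distribution the two variables are correlated (matching outcomes are more likely), so the mutual information is strictly positive; for independent variables it would be zero.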
See also
- total correlation on Wikipedia
- interaction information on Wikipedia