mutual information

From Wiktionary, the free dictionary

English

English Wikipedia has an article on: mutual information

Noun

mutual information (usually uncountable, plural mutual informations)

  1. (information theory) A measure of the entropic (informational) correlation between two random variables.
    Mutual information between two random variables X and Y is what is left over when their mutual conditional entropies H(X|Y) and H(Y|X) are subtracted from their joint entropy H(X,Y). It can be given by the formula I(X;Y) = H(X,Y) − H(X|Y) − H(Y|X) (a numerical sketch follows below).
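
  A minimal numerical sketch of the formula above, assuming a small hypothetical joint distribution for two binary random variables X and Y (the code, variable names, and numbers are illustrative, not part of the entry):

    import numpy as np

    def entropy(p):
        # Shannon entropy (in bits) of a probability array; zero entries are skipped.
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    # Hypothetical joint distribution P(X, Y) for two binary variables.
    p_xy = np.array([[0.4, 0.1],
                     [0.1, 0.4]])

    p_x = p_xy.sum(axis=1)              # marginal distribution of X
    p_y = p_xy.sum(axis=0)              # marginal distribution of Y

    H_xy = entropy(p_xy.ravel())        # joint entropy H(X,Y)
    H_x_given_y = H_xy - entropy(p_y)   # conditional entropy H(X|Y) = H(X,Y) - H(Y)
    H_y_given_x = H_xy - entropy(p_x)   # conditional entropy H(Y|X) = H(X,Y) - H(X)

    # Mutual information via the formula in the definition.
    I_xy = H_xy - H_x_given_y - H_y_given_x
    print(I_xy)                         # about 0.278 bits for this joint distribution

  The same value is obtained from the equivalent identity I(X;Y) = H(X) + H(Y) − H(X,Y), which follows from H(X|Y) = H(X,Y) − H(Y) and H(Y|X) = H(X,Y) − H(X).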


Translations

See also