Markov chain

From Wiktionary, the free dictionary

English

Noun

Markov chain (plural Markov chains)

  1. (probability theory) A discrete-time stochastic process with the Markov property.
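
     One common formulation of the Markov property mentioned in this sense (a sketch, assuming a discrete-time process X_0, X_1, X_2, … on a countable state space) is:

     \[
     P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n)
     \]

     That is, the conditional distribution of the next state depends only on the current state, not on the earlier history of the process.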
