Markov process

Definition from Wiktionary, the free dictionary

English

Wikipedia has an article on: Markov process

Noun

Markov process (plural Markov processes)

  1. (probability theory) A stochastic process in which, given the present state, the probability distribution of future states is conditionally independent of the path of past states.
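The defining property can be sketched with a short simulation: at each step, the next state is drawn from a distribution that depends only on the current state, never on earlier history. The two-state weather model below is a hypothetical illustration; the state names and transition probabilities are assumptions, not part of the definition.

```python
import random

def simulate_markov_chain(transition, state, steps, seed=0):
    """Simulate a finite-state Markov process.

    `transition` maps each state to a dict of next-state probabilities.
    The next state depends only on the current `state` (Markov property).
    """
    rng = random.Random(seed)
    path = [state]
    for _ in range(steps):
        r = rng.random()
        cumulative = 0.0
        for next_state, p in transition[state].items():
            cumulative += p
            if r < cumulative:
                state = next_state
                break
        path.append(state)
    return path

# Hypothetical two-state weather model (probabilities are illustrative).
transition = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}
path = simulate_markov_chain(transition, "sunny", 10)
```

Because each row of `transition` sums to 1 and is indexed only by the current state, the simulated path exhibits exactly the conditional independence the definition describes.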

Related terms

Translations