Markovian

Definition from Wiktionary, the free dictionary
See also: markovian

English

Etymology

Markov + -ian, named for the Russian mathematician Andrey Markov.

Adjective

Markovian (not comparable)

  1. (statistics, of a process) Exhibiting the Markov property, in which the conditional probability distribution of future states of the process, given the present state and all past states, depends only upon the present state and not on any past states.
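The property in sense 1 can be illustrated with a minimal Python sketch of a two-state Markov chain. The states and transition probabilities here are hypothetical, chosen only for illustration; the key point is that the next state is sampled from a distribution conditioned solely on the present state, with no reference to earlier history.

```python
import random

# Hypothetical two-state weather model: transition probabilities
# depend only on the current state, never on past states.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Sample the next state given only the present state."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, steps, seed=0):
    """Run the chain for `steps` transitions from `start`."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(steps):
        chain.append(step(chain[-1], rng))
    return chain
```

Because `step` receives only the current state, the conditional distribution of the next state given the entire history equals its distribution given the present state alone, which is exactly the Markov property.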
