Markov chain
From Wiktionary, the free dictionary
English
Noun
Markov chain (plural Markov chains)
(probability theory) A discrete-time stochastic process with the Markov property.
Translations
probability theory
Czech: Markovův řetězec m
Estonian: Markovi ahel
Finnish: Markovin ketju
Hungarian: Markov-lánc
Icelandic: Markov-keðja f
Japanese: マルコフ連鎖 (Marukofu rensa)
Polish: łańcuch Markowa m
Russian: цепь Ма́ркова f (cepʹ Márkova)
Spanish: cadena de Márkov f
See also
Wikipedia article on Markov chains
Categories: English lemmas, English nouns, English countable nouns, English multiword terms, en:Probability theory, English eponyms