Encyclopedia of Medical Concepts
Markov Chains
Definition: A stochastic process such that the conditional probability distribution for a state at any future instant, given the present state, is unaffected by any additional knowledge of the past history of the system.
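The defining property above says that the next state depends only on the current state, not on how the process arrived there. A minimal sketch of this idea, using an invented two-state weather chain (the states and transition probabilities are purely illustrative, not from the source):

```python
import random

# Hypothetical two-state chain for illustration; the transition
# probabilities are invented for this example.
STATES = ["sunny", "rainy"]
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Draw the next state from a distribution conditioned ONLY on the
    current state -- the Markov property: nothing about the earlier
    history of the chain is consulted."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point round-off

def simulate(start, n):
    """Generate a chain of n transitions starting from `start`."""
    chain = [start]
    for _ in range(n):
        chain.append(step(chain[-1]))
    return chain
```

Because each call to `step` receives only the current state, any two simulations that reach the same state have identical distributions over their futures, regardless of their differing pasts.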
Other names
Markov Process; Processes, Markov; Process, Markov; Markov Processes; Chains, Markov; Chain, Markov; Markov Chain