Encyclopedia of Medical Concepts

Markov Chains

Definition: A stochastic process such that the conditional probability distribution for a state at any future instant, given the present state, is unaffected by any additional knowledge of the past history of the system.

Other names: Markov Process; Processes, Markov; Process, Markov; Markov Processes; Chains, Markov; Chain, Markov; Markov Chain
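
The condition that the future is "unaffected by any additional knowledge of the past history" is the Markov property. As a minimal formal sketch (the notation X_n for the state at step n is assumed here, not part of the MeSH entry), for a discrete-time chain:

    P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n)

That is, the probability of the next state depends only on the present state, not on how the chain reached it. For example, in a hypothetical two-state chain with states "well" and "ill", the probability of being "ill" at the next step is determined entirely by whether the chain is currently "well" or "ill", regardless of the earlier sequence of states.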


Sources: NLM Medical Subject Headings, NIH UMLS, Drugs@FDA, FDA AERS. Original data copyright United States Government; no endorsement implied. Last modified 6/6/2012.
