Finite Markov chains and recurrent events.
|dc.contributor.author||Hardie, Charles H||en_US|
|dc.description||Thesis (M.A.)--Boston University. N.B.: Page 3 of Abstract is incorrectly labeled as Page 2. No content is missing from the thesis.||en_US|
|dc.description.abstract||Many processes or systems can be fully or partially described using that part of probability theory called "stochastic processes." The term stochastic process is usually used when a change of state of a process may occur with time. A Markov process is a stochastic process in which the probability of the process being in some state in the future depends only on the state the process is presently in. A Markov chain is a Markov process which may occupy only a finite number or a denumerably infinite number of states. [TRUNCATED]||en_US|
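The abstract's definition can be illustrated with a minimal sketch (not taken from the thesis): a two-state finite Markov chain whose next-state distribution depends only on the current state. The transition matrix P below is a hypothetical example chosen for illustration.

```python
# Illustrative sketch of a finite Markov chain (hypothetical example,
# not from the thesis). P[i][j] = Pr(next state = j | current state = i);
# the Markov property means this is all the history that matters.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(dist, P):
    """Advance the state distribution one step: new_dist = dist * P."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# Start surely in state 0 and iterate; the distribution converges to the
# stationary distribution pi satisfying pi = pi * P (here pi = (5/6, 1/6)).
dist = [1.0, 0.0]
for _ in range(100):
    dist = step(dist, P)
```

Because the state space is finite and every state can reach every other, repeated application of P drives any starting distribution to the same stationary distribution.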
|dc.rights||Based on investigation of the BU Libraries' staff, this work is free of known copyright restrictions.||en_US|
|dc.title||Finite Markov chains and recurrent events.||en_US|
|etd.degree.name||Master of Arts||en_US|