Finite Markov chains and recurrent events.
Hardie, Charles H
Many processes or systems can be fully or partially described using that part of probability theory called "stochastic processes." The term stochastic process is usually used when the state of a process may change with time. A Markov process is a stochastic process in which the probability of the process being in some state in the future depends only on the state the process is presently in. A Markov chain is a Markov process which may occupy only a finite number or a denumerably infinite number of states. [TRUNCATED]
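The Markov property described above can be illustrated with a small simulation: the next state is drawn using only the current state and a transition matrix, with no reference to earlier history. This is a minimal sketch in Python; the two-state matrix P and the function names are illustrative examples, not taken from the thesis.

```python
import random

# Row-stochastic transition matrix for a hypothetical two-state chain:
# P[i][j] is the probability of moving from state i to state j in one step.
P = [
    [0.9, 0.1],  # from state 0: stay with prob 0.9, move to 1 with prob 0.1
    [0.5, 0.5],  # from state 1: move to 0 with prob 0.5, stay with prob 0.5
]

def step(state, P, rng):
    """Draw the next state using ONLY the current state (the Markov property)."""
    r = rng.random()
    cumulative = 0.0
    for j, p in enumerate(P[state]):
        cumulative += p
        if r < cumulative:
            return j
    return len(P) - 1  # guard against floating-point round-off

def simulate(start, n_steps, P, seed=0):
    """Generate a sample path of length n_steps + 1 starting from `start`."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1], P, rng))
    return path

path = simulate(start=0, n_steps=10, P=P)
```

Because the chain here has finitely many states, it is a finite Markov chain in the sense of the abstract; a denumerably infinite chain would need a transition rule defined over all of the nonnegative integers instead of a fixed matrix.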
Thesis (M.A.)--Boston University N.B.: Page 3 of Abstract is incorrectly labeled as Page 2. No content is missing from thesis.