Show simple item record

dc.contributor.author: Hardie, Charles H
dc.date.accessioned: 2016-04-07T15:25:01Z
dc.date.available: 2016-04-07T15:25:01Z
dc.date.issued: 1961
dc.date.submitted: 1961
dc.identifier.other: b1456242x
dc.identifier.uri: https://hdl.handle.net/2144/15538
dc.description: Thesis (M.A.)--Boston University. N.B.: Page 3 of the Abstract is incorrectly labeled as Page 2; no content is missing from the thesis.
dc.description.abstract: Many processes or systems can be fully or partially described using that part of probability theory called "stochastic processes." The term stochastic process is usually used when a change of state of a process may occur with time. A Markov process is a stochastic process in which the probability of the process being in some state in the future depends only on the state the process is presently in. A Markov chain is a Markov process that may occupy only a finite number or a denumerably infinite number of states. [TRUNCATED]
dc.language.iso: en_US
dc.publisher: Boston University
dc.rights: Based on investigation by the BU Libraries' staff, this work is free of known copyright restrictions.
dc.title: Finite Markov chains and recurrent events.
dc.type: Thesis/Dissertation
etd.degree.name: Master of Arts
etd.degree.level: masters
etd.degree.discipline: Mathematics
etd.degree.grantor: Boston University
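The Markov property described in the abstract (the future depends only on the present state) can be sketched with a small finite chain. This is a minimal illustration only; the two-state chain and its transition probabilities are hypothetical and not taken from the thesis:

```python
import numpy as np

# Hypothetical 2-state Markov chain: P[i][j] is the probability of
# moving from state i to state j in one step. Each row sums to 1.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Markov property: the n-step transition probabilities depend only on
# the current distribution, so they are given by matrix powers of P.
start = np.array([1.0, 0.0])                       # start in state 0
after_10 = start @ np.linalg.matrix_power(P, 10)   # distribution after 10 steps

# For this chain the distribution approaches the stationary vector pi
# satisfying pi P = pi, which here is (5/6, 1/6).
print(after_10)
```

The second state acts as a recurrent state that is revisited with probability one, which is the kind of recurrent-event behavior the thesis title refers to.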

