Formal verification and control of discrete-time stochastic systems
Lahijanian, Morteza M
This thesis establishes theoretical and computational frameworks for formal verification and control synthesis for discrete-time stochastic systems. Given a temporal logic specification, the system is analyzed to determine the probability that the specification is achieved, and an input law is automatically generated to maximize this probability. The approach consists of three main steps: constructing an abstraction of the stochastic system as a finite Markov model, mapping the given specification onto this abstraction, and finding a control policy that maximizes the probability of satisfying the specification. The framework uses Probabilistic Computation Tree Logic (PCTL) as the specification language, and the verification and synthesis algorithms are inspired by the field of probabilistic model checking. For abstraction, a method is first developed to compute exact bounds on the transition probabilities between regions of interest in the domain of the stochastic system. These bounds are then used to construct an Interval-valued Markov Chain (IMC) or a Bounded-parameter Markov Decision Process (BMDP) abstraction of the system. A representative transition probability is then used to construct an approximating Markov chain (MC) for the stochastic system. The exact bound of the approximation error and an explicit expression for its growth over time are derived. To achieve a desired error value, an adaptive refinement algorithm that takes advantage of the linear dynamics of the system is employed. To verify the properties of the continuous-domain stochastic system against a finite-time PCTL specification, IMC and BMDP verification algorithms are designed. These algorithms have low computational complexity and are inspired by the MC model checking algorithms; the low complexity is achieved by over-approximating the probabilities of satisfaction. To increase the precision of the method, two adaptive refinement procedures are proposed.
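To give a concrete flavor of the over-approximating verification step, the following is a minimal sketch (not the thesis's actual algorithm) of interval value iteration on an IMC: it computes an upper bound on the probability of reaching a target region within k steps by, at each state, choosing the feasible transition distribution that pours as much probability mass as the intervals allow onto high-value successors. The function name `imc_upper_reach` and the small example model are illustrative assumptions.

```python
import numpy as np

def imc_upper_reach(P_low, P_up, target, k):
    """Sketch: upper bound on the probability of reaching `target` within
    k steps in an Interval-valued Markov Chain.

    P_low, P_up are n x n row-wise lower/upper transition bounds with
    P_low <= P_up, each row of P_low summing to <= 1, and each row of
    P_up summing to >= 1.  `target` is a set of state indices.
    """
    n = P_low.shape[0]
    v = np.zeros(n)
    v[list(target)] = 1.0
    for _ in range(k):
        new_v = v.copy()
        for s in range(n):
            if s in target:
                continue
            # Value-maximizing feasible distribution: start from the lower
            # bounds, then distribute the remaining mass to successors in
            # decreasing order of their current value.
            order = np.argsort(-v)
            p = P_low[s].copy()
            slack = 1.0 - p.sum()
            for t in order:
                give = min(P_up[s, t] - P_low[s, t], slack)
                p[t] += give
                slack -= give
                if slack <= 0:
                    break
            new_v[s] = p @ v
        v = new_v
    return v
```

A matching lower bound is obtained symmetrically by assigning the slack to low-value successors first; the gap between the two bounds is what the adaptive refinement procedures shrink.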
Furthermore, a method is developed for generating the control strategy that maximizes the probability of satisfying a PCTL specification for Markov Decision Processes (MDPs). Through a similar method, a formal synthesis framework is constructed for continuous-domain stochastic systems by utilizing their BMDP abstractions. These methodologies are then applied in robotics as a means of automatically deploying, from PCTL specifications, a mobile robot subject to noisy sensors and actuators. The technique is demonstrated through simulation and experimental case studies of the deployment of a robot in an indoor environment. The contributions of the thesis include verification and synthesis frameworks for discrete-time stochastic linear systems; abstraction schemes from stochastic systems to MCs, IMCs, and BMDPs; model checking algorithms with low computational complexity for IMCs and BMDPs against finite-time PCTL formulas; synthesis algorithms for MDPs from PCTL formulas; and a computational framework for automatic deployment of a mobile robot from PCTL specifications. The approaches were validated by simulations and experiments. The algorithms and techniques in this thesis help make discrete-time stochastic systems a more useful and effective class of models for the analysis and control of real-world systems.
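The core of such a synthesis step can be sketched as finite-horizon value iteration on an MDP for a bounded-reachability property (the probabilistic fragment of a PCTL "eventually within k steps" formula), extracting the maximizing action at each state and time step. This is a simplified illustration under assumed inputs, not the thesis's full PCTL synthesis algorithm; `mdp_max_reach` and the toy model below are hypothetical.

```python
import numpy as np

def mdp_max_reach(P, target, k):
    """Sketch: maximize the probability of reaching `target` within k steps.

    P is a list of n x n transition matrices, one per action.  Returns the
    optimal values and a time-indexed policy: policy[t][s] is the action to
    take in state s at time step t.
    """
    n = P[0].shape[0]
    v = np.zeros(n)
    v[list(target)] = 1.0
    steps = []
    for _ in range(k):
        q = np.stack([Pa @ v for Pa in P])  # q[a, s]: value of action a in s
        steps.append(q.argmax(axis=0))      # greedy action per state
        v = q.max(axis=0)
        v[list(target)] = 1.0               # target states stay satisfied
    steps.reverse()                         # computed backwards in time
    return v, steps
```

The same backward recursion, run on the lower and upper transition bounds of a BMDP abstraction, yields pessimistic and optimistic satisfaction probabilities for the underlying continuous-domain system.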
Thesis (Ph.D.)--Boston University