Stochastic control theory is a relatively young branch of mathematics.
Its intensive development began in the late 1950s and early 1960s.
During that period an extensive literature appeared on optimal
stochastic control using the quadratic performance criterion (see the
references in Wonham [76]). At the same time, Girsanov [25] and
Howard [26] took the first steps toward constructing a general theory,
based on Bellman's technique of dynamic programming, which he had
developed somewhat earlier [4]. Two types of engineering problems engendered two
different parts of stochastic control theory. Problems of the first type
are associated with multistep decision making in discrete time, and are
treated in the theory of discrete stochastic dynamic programming. For
more on this theory we note, in addition to the work of Howard and
Bellman mentioned above, the books by Derman [8], Mine and Osaki
[55], and Dynkin and Yushkevich [12].
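To fix ideas, the central object of that theory is Bellman's equation; in a generic discrete-time Markov decision model (the notation here is illustrative and not taken from the works cited: $v_n$ is the value with $n$ steps to go, $A$ the set of actions, $r$ the one-step reward, and $p$ the transition probabilities) it reads
\[
v_{n+1}(x) = \sup_{a \in A} \Bigl[\, r(x,a) + \sum_{y} p(y \mid x,a)\, v_n(y) \Bigr], \qquad v_0 \equiv 0,
\]
so that the optimal value is computed by backward induction over the decision steps.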
Another class of engineering problems which encouraged the
development of stochastic control theory involves continuous-time
control of a dynamic system in the presence of random noise. The case
where the system is described by a differential equation and the noise
is modeled as a continuous-time random process forms the core of the
optimal control theory of diffusion processes. This book deals with
this latter theory.
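As a minimal sketch of this setting (the symbols below are generic and chosen for illustration, not the book's fixed notation), the state $x_t$ is governed by a stochastic differential equation driven by a Wiener process $w_t$,
\[
dx_t = b(x_t,\alpha_t)\,dt + \sigma(x_t,\alpha_t)\,dw_t, \qquad x_0 = x,
\]
and the controller chooses the process $\alpha_t$ so as to minimize an expected cost such as
\[
v(x) = \inf_{\alpha}\, \mathsf{E}\Bigl[\, \int_0^T f(x_t,\alpha_t)\,dt + g(x_T) \Bigr].
\]
Dynamic programming then connects the value function $v$ with a nonlinear partial differential equation, the Bellman equation.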