This new 4th edition offers an introduction to optimal
control theory and its diverse applications in management science and
economics. It introduces students to the concept of the maximum
principle in continuous (as well as discrete) time by combining dynamic
programming and Kuhn-Tucker theory. While some mathematical background
is needed, the emphasis of the book is not on mathematical rigor, but on
modeling realistic situations encountered in business and economics. It
applies optimal control theory to the functional areas of management, including finance, production, and marketing, as well as to the economics of growth and of natural resources. In addition, it features material on
stochastic Nash and Stackelberg differential games and an adverse
selection model in the principal-agent framework.
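To give a flavor of the subject, a standard continuous-time problem of the kind the maximum principle addresses can be sketched as follows (the notation F, f, S, and Omega here is generic and illustrative, not necessarily the book's own): choose a control path u(t) to maximize

\[ J = \int_0^T F(x(t),u(t),t)\,dt + S(x(T)) \]

subject to the state equation \(\dot{x}(t) = f(x(t),u(t),t)\) with \(x(0) = x_0\) and \(u(t) \in \Omega(t)\). Defining the Hamiltonian \(H(x,u,\lambda,t) = F(x,u,t) + \lambda f(x,u,t)\), the maximum principle asserts that along an optimal trajectory there exists an adjoint variable \(\lambda(t)\) satisfying

\[ \dot{\lambda}(t) = -\frac{\partial H}{\partial x}, \qquad \lambda(T) = \frac{\partial S}{\partial x}(x(T)), \]

and that the optimal control maximizes the Hamiltonian at each instant,

\[ u^*(t) \in \arg\max_{u \in \Omega(t)} H(x^*(t), u, \lambda(t), t). \]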
Exercises are included in each chapter, and answers to selected exercises are provided to help deepen readers' understanding of the material covered.
Also included are appendices of supplementary material on the solution
of differential equations, the calculus of variations and its ties to
the maximum principle, and special topics including the Kalman filter,
certainty equivalence, singular control, a global saddle point theorem,
Sethi-Skiba points, and distributed parameter systems.
Optimal control methods are used to determine the best way to control a dynamic system over time. The theoretical work in this field serves as the foundation for the book, in which the author applies it to business management problems developed from his own research and classroom instruction. The new edition has been refined and updated, making it a valuable resource not only for graduate courses on applied optimal control theory, but also for financial and industrial engineers, economists, and operations researchers interested in applying dynamic optimization in their fields.