Controlled Diffusion Processes 1980 Edition
Contributor(s): Krylov, N. V. (Author), Aries, A. B. (Translator)
ISBN: 3540709134     ISBN-13: 9783540709138
Publisher: Springer
OUR PRICE: $104.49
Product Type: Paperback
Published: October 2008
Annotation: This book deals with the optimal control of solutions of fully observable Itô-type stochastic differential equations. The validity of the Bellman differential equation for payoff functions is proved and rules for optimal control strategies are developed.

Topics include optimal stopping; one-dimensional controlled diffusion; Lp-estimates of the distributions of stochastic integrals; the existence theorem for stochastic equations; the Itô formula for functions; and the Bellman principle, equation, and normalized equation.
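
For orientation, the standard setting behind these topics can be sketched as follows; the notation (the drift b, diffusion coefficient sigma, running and terminal payoffs f and g, and control alpha) is generic for this illustration and not necessarily the author's:

\[
dX_t = b(X_t,\alpha_t)\,dt + \sigma(X_t,\alpha_t)\,dW_t,
\qquad
v(t,x) = \sup_{\alpha}\,\mathbb{E}\Big[\int_t^T f(X_s,\alpha_s)\,ds + g(X_T)\,\Big|\,X_t = x\Big],
\]
\[
\partial_t v + \sup_{a}\Big[\tfrac{1}{2}\,\mathrm{tr}\big(\sigma\sigma^{\top}(x,a)\,D^2 v\big) + b(x,a)\cdot Dv + f(x,a)\Big] = 0,
\qquad v(T,x) = g(x).
\]

The second display is the Bellman (dynamic programming) differential equation for the payoff function referred to in the annotation; the normalized Bellman equation mentioned in the topic list is a variant of it studied in the book.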

Additional Information
BISAC Categories:
- Technology & Engineering | Robotics
- Mathematics | Probability & Statistics - General
- Mathematics | Applied
Dewey: 629.831
Series: Applications of Mathematics
Physical Information: 0.8" H x 6.1" W x 9.1" L (1.10 lbs), 310 pages
 
Descriptions, Reviews, Etc.
Publisher Description:
Stochastic control theory is a relatively young branch of mathematics. The beginning of its intensive development falls in the late 1950s and early 1960s. During that period an extensive literature appeared on optimal stochastic control using the quadratic performance criterion (see references in Wonham [76]). At the same time, Girsanov [25] and Howard [26] made the first steps in constructing a general theory, based on Bellman's technique of dynamic programming, developed by him somewhat earlier [4].

Two types of engineering problems engendered two different parts of stochastic control theory. Problems of the first type are associated with multistep decision making in discrete time, and are treated in the theory of discrete stochastic dynamic programming. For more on this theory, we note, in addition to the work of Howard and Bellman mentioned above, the books by Derman [8], Mine and Osaki [55], and Dynkin and Yushkevich [12]. Another class of engineering problems which encouraged the development of the theory of stochastic control involves time-continuous control of a dynamic system in the presence of random noise. The case where the system is described by a differential equation and the noise is modeled as a time-continuous random process is the core of the optimal control theory of diffusion processes. This book deals with this latter theory.