Optimal Control and the Calculus of Variations

$169.95

Paperback


English
Oxford University Press
07 September 1995
This paperback edition of a successful textbook for final-year undergraduate mathematicians and control engineering students contains exercises and many worked examples, with complete solutions and hints, making it ideal not only as a class textbook but also for individual study. The introduction to optimal control begins by considering the problem of minimizing a function of many variables, before moving on to the main subject: the optimal control of systems governed by ordinary differential equations.
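
As a rough sketch of the kind of problem the description refers to (not taken from the book, and in notation chosen here purely for illustration), the standard finite-horizon optimal control problem can be written as

\[
  \min_{u(\cdot)} \; J(u) = \int_{t_0}^{t_1} f_0\bigl(x(t), u(t)\bigr)\,dt
  \quad \text{subject to} \quad
  \dot{x}(t) = f\bigl(x(t), u(t)\bigr), \qquad x(t_0) = x_0,
\]

with the control u(t) restricted to an admissible set. Under one common convention, the Pontryagin maximum principle (Part 4 of the contents below) introduces a Hamiltonian H(x, u, p) = p^T f(x, u) - f_0(x, u) and requires an optimal control u*(t) to maximise H along the optimal trajectory.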

By:  
Imprint:   Oxford University Press
Country of Publication:   United Kingdom
Edition:   New edition
Dimensions:   Height: 233mm,  Width: 155mm,  Spine: 15mm
Weight:   365g
ISBN:   9780198514893
ISBN 10:   0198514891
Pages:   242
Publication Date:   07 September 1995
Audience:   College/higher education, Primary
Format:   Paperback
Publisher's Status:   Active
1: Introduction
1.1: The maxima and minima of functions
1.2: The calculus of variations
1.3: Optimal control
Part 2: Optimization in ℝⁿ
2.1: Functions of one variable
2.2: Critical points, end-points, and points of discontinuity
2.3: Functions of several variables
2.4: Minimization with constraints
2.5: A geometrical interpretation
2.6: Distinguishing maxima from minima
Part 3: The calculus of variations
3.1: Problems in which the end-points are not fixed
3.2: Finding minimizing curves
3.3: Isoperimetric problems
3.4: Sufficiency conditions
3.5: Fields of extremals
3.6: Hilbert's invariant integral
3.7: Semi-fields and the Jacobi condition
Part 4: Optimal Control I: Theory
4.1: Introduction
4.2: Control of a simple first-order system
4.3: Systems governed by ordinary differential equations
4.4: The optimal control problem
4.5: The Pontryagin maximum principle
4.6: Optimal control to target curves
Part 5: Optimal Control II: Applications
5.1: Time-optimal control of linear systems
5.2: Optimal control to target curves
5.3: Singular controls
5.4: Fuel-optimal controls
5.5: Problems where the cost depends on x(t₁)
5.6: Linear systems with quadratic cost
5.7: The steady-state Riccati equation
5.8: The calculus of variations revisited
Part 6: Proof of the Maximum Principle of Pontryagin
6.1: Convex sets in ℝⁿ
6.2: The linearized state equations
6.3: Behaviour of H on an optimal path
6.4: Sufficiency conditions for optimal control
Appendix: Answers and hints for the exercises
Bibliography
Index

Reviews for Optimal Control and the Calculus of Variations

The author has achieved his aim. Anyone who is curious to know what optimal control theory is all about, or who wishes to begin specializing in this field, would benefit by having this book close at hand. Technical libraries should acquire it, too. ... highly recommended. --Applied Mechanics Review

