This is an excellent textbook on dynamic optimization (the title suggests a focus on optimal control, but ideas from dynamic programming and the calculus of variations appear as well), particularly if you are an engineer or looking for a problem-oriented treatment of the subject. The authors first illustrate the basic ideas of optimization in the static, finite-dimensional setting in the first chapter before applying those same ideas to dynamic optimization problems.

The ideas and concepts are derived and explained directly in the text; there are no formal definitions or proofs, which readers used to rigorous mathematical textbooks may find a bit distracting. Still, the concepts are derived fairly precisely (though the associated technicalities tend to be omitted). The advantage is that no knowledge of functional analysis is needed to follow the material: prior exposure to calculus, ordinary differential equations, and perhaps a bit of linear programming will suffice (and even the last is not strictly necessary, since the main ideas are covered in the first chapter anyway). Numerous solved problems illustrate how to apply these concepts, primarily to engineering and physical problems.

However, the book covers little on infinite-horizon optimization and related problems, which are widely applied in economics, so if that part of dynamic optimization interests you, there are more suitable textbooks than this one.