Modern Control Engineering focuses on the methodologies, principles, approaches, and technologies employed in modern control engineering, including dynamic programming, boundary iterations, and linear state equations. The publication first considers the state representation of dynamical systems and finite-dimensional optimization. Discussions focus on optimal control of discrete-time dynamical systems, parameterization of dynamical control problems, conjugate direction methods, convexity and sufficiency, linear state equations, the transition matrix, and the stability of discrete-time linear systems. The text then tackles infinite-dimensional optimization, including computations with inequality constraints, the gradient method in function space, quasilinearization, computation of optimal control by direct and indirect methods, and boundary iterations. The book then examines dynamic programming and introductory stochastic estimation and control. Topics include deterministic multivariable observers, stochastic feedback control, the stochastic linear-quadratic control problem, the general calculation of optimal control by dynamic programming, and results for linear multivariable digital control systems. The publication is a dependable reference for engineers and researchers seeking to explore modern control engineering.