Studies in Adaptive and Optimal Control of Stochastic Systems
Project Award Date: 09-01-2014
Adaptive and optimal control of stochastic systems are major areas of research in control theory. Several important problems are proposed for explicit solution, both to enable the implementation of optimal controllers and to determine the quality of suboptimal controls. The stochastic systems considered include linear and nonlinear systems that are finite or infinite dimensional, with noise processes that include arbitrary fractional Brownian motions and other processes, because such noises are identified empirically. A family of stochastic differential games with general noise processes is likewise proposed for the determination of explicit optimal control strategies. Nonlinear controlled stochastic systems that evolve in some well-known geometric spaces, particularly symmetric spaces, are to be solved explicitly, as are the associated stochastic differential games. Some topics in random matrices are proposed using the geometry of these matrices. Adaptive control of linear stochastic systems with arbitrary fractional Brownian motions is proposed using explicit optimal controls, and the explicit optimal controls for nonlinear systems should make the corresponding adaptive control problems tractable. The convergence of numerical solutions of continuous-time identification algorithms for the estimation of parameters in linear systems with fractional Brownian motions is also to be established.
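To make the identification problem concrete, the following is a minimal illustrative sketch (not the project's method): it samples a fractional Brownian motion path by Cholesky factorization of its exact covariance, simulates a scalar linear system driven by that noise, and applies a discretized least-squares drift estimator. The system `dX = theta X dt + dB^H`, the parameter values, and the naive estimator are all assumptions chosen for illustration; the estimator is the classical continuous-time one for the Brownian case H = 1/2, used here only to indicate the kind of algorithm whose convergence is studied.

```python
import numpy as np

def fbm_path(n, hurst, T=1.0, seed=0):
    """Sample fractional Brownian motion B^H on [0, T] at n grid points
    via Cholesky factorization of the exact covariance
    cov(B^H_s, B^H_t) = (s^{2H} + t^{2H} - |t - s|^{2H}) / 2."""
    rng = np.random.default_rng(seed)
    t = np.linspace(T / n, T, n)
    s, u = np.meshgrid(t, t)
    cov = 0.5 * (s ** (2 * hurst) + u ** (2 * hurst)
                 - np.abs(u - s) ** (2 * hurst))
    path = np.linalg.cholesky(cov) @ rng.standard_normal(n)
    # Prepend the time-zero point B^H_0 = 0.
    return np.concatenate(([0.0], t)), np.concatenate(([0.0], path))

# Scalar linear system dX_t = theta * X_t dt + dB^H_t (illustrative values),
# simulated by an Euler scheme on the fBm grid.
theta_true = -1.0
hurst = 0.7
n, T = 2000, 5.0
t, B = fbm_path(n, hurst, T=T)
dt = T / n
X = np.empty(n + 1)
X[0] = 1.0
for k in range(n):
    X[k + 1] = X[k] + theta_true * X[k] * dt + (B[k + 1] - B[k])

# Discretized least-squares estimate of the drift parameter theta:
# theta_hat = sum X_k (X_{k+1} - X_k) / sum X_k^2 dt.
theta_hat = np.sum(X[:-1] * np.diff(X)) / np.sum(X[:-1] ** 2 * dt)
print(f"theta_hat = {theta_hat:.3f} (true theta = {theta_true})")
```

The Cholesky construction is exact but costs O(n^3); for long paths, circulant-embedding methods are the usual alternative. The convergence questions raised in the project concern precisely how such discretized estimators behave as the grid is refined when the driving noise is a general fractional Brownian motion.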