This paper examines optimal control processes represented by stochastic sequential dynamical systems involving a parameter, obtained under unique solution conditions with constant input values. First, the principle of optimality is proven for the considered process. Then, the Bellman equation is constructed by applying the dynamic programming method. Moreover, a particular set, defined as an accessible set, is established to show the existence of a solution to the optimal control problem. Finally, the need for further research is discussed.
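To illustrate the dynamic programming method behind the Bellman equation, the following is a minimal sketch (not the paper's model) of a finite-horizon stochastic sequential system with finitely many states and controls. All quantities here (`cost`, `P`, the horizon) are hypothetical illustrative data; the backward recursion V_t(x) = min_u E[c(x,u) + V_{t+1}(x')] is the general Bellman scheme the abstract refers to.

```python
import numpy as np

# Hypothetical example data (not from the paper): a finite state/control
# space with stochastic transitions, solved by backward induction.
rng = np.random.default_rng(0)
n_states, n_controls, horizon = 4, 3, 5

# Stage cost c(x, u) and transition probabilities P[u, x, x'].
cost = rng.uniform(0.0, 1.0, size=(n_states, n_controls))
P = rng.uniform(size=(n_controls, n_states, n_states))
P /= P.sum(axis=2, keepdims=True)  # each row is a probability distribution

V = np.zeros(n_states)                         # terminal value V_T = 0
policy = np.zeros((horizon, n_states), dtype=int)
for t in reversed(range(horizon)):
    # Q[x, u] = c(x, u) + E_{x'}[ V(x') | x, u ]  (the Bellman operator)
    Q = cost + np.einsum('uxy,y->xu', P, V)
    policy[t] = Q.argmin(axis=1)               # optimal control at stage t
    V = Q.min(axis=1)                          # optimal cost-to-go

print(V)  # expected optimal cost from each initial state
```

Since each stage cost lies in [0, 1], the computed values satisfy 0 ≤ V(x) ≤ horizon; the recursion runs in O(horizon · n_controls · n_states²) time.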
Keywords: Optimal control process, Bellman's equation, Dynamic programming, Stochastic sequential dynamical systems
| Primary Language | English |
|---|---|
| Subjects | Mathematical Sciences |
| Journal Section | Articles |
| Authors | |
| Publication Date | August 31, 2021 |
| Published in Issue | Year 2021, Volume 10, Issue 2 |
As of 2021, JNRS is licensed under a Creative Commons Attribution-NonCommercial 4.0 International Licence (CC BY-NC).