Reinforcement learning for condition-based control of gas turbine engines

Authors: Ibrahim Sanusi, Visakan Kadirkamanathan, Paul Trodden, Andrew R. Mills, Tony J. Dodd
Publication year: 2019
Source: ECC
DOI: 10.23919/ecc.2019.8795878
Description: A condition-based control framework is proposed for gas turbine engines using reinforcement learning and adaptive dynamic programming (RL-ADP). The system behaviour, specifically the fuel-efficiency function and constraints, exhibits unknown degradation patterns that vary from engine to engine. Because of these variations, accurate system models that describe the true system states over the life of the engines are difficult to obtain. Consequently, model-based control techniques cannot fully compensate for the effects of the variations on system performance. The proposed RL-ADP control framework is based on Q-learning and uses measurements of desired performance quantities as reward signals to learn and adapt the system efficiency maps. This is achieved without knowledge of the system variation or degradation dynamics, thus providing a through-life adaptation strategy that delivers improved system performance. To overcome the long-standing difficulties associated with applying adaptive techniques in a safety-critical setting, a dual control-loop structure is proposed for the implementation of the RL-ADP scheme. The overall control framework maintains guarantees on the main thrust control loop while extracting improved performance as the engine degrades by tuning sets of variable-geometry components in the RL-ADP control loop. Simulation results on representative engine data sets demonstrate the effectiveness of this approach compared with an industry-standard gain-scheduling controller.
Database: OpenAIRE
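
The sketch below illustrates, in broad strokes, the Q-learning adaptation idea summarised in the description: a tabular agent tunes a single discretised variable-geometry setting using a measured fuel-efficiency signal as reward, while the thrust loop is assumed to be handled separately by the baseline controller. The toy plant, state/action discretisation, and parameter values are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Minimal, hypothetical sketch of Q-learning-based adaptation of a
# variable-geometry setting. The engine/efficiency model below is a toy
# stand-in; only the update rule reflects standard Q-learning.

rng = np.random.default_rng(0)

N_SETTINGS = 11        # discretised variable-geometry positions (assumed)
N_DEG_LEVELS = 5       # coarse degradation levels observed via sensors (assumed)
Q = np.zeros((N_DEG_LEVELS, N_SETTINGS))

ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2  # illustrative learning parameters


def measured_efficiency(setting: int, degradation: float) -> float:
    """Toy stand-in for a measured fuel-efficiency reward.

    The best setting drifts with degradation, mimicking an efficiency map
    that changes over the engine's life.
    """
    optimum = 5 + 4 * degradation
    return -((setting - optimum) ** 2) + rng.normal(scale=0.1)


def degradation_level(degradation: float) -> int:
    """Map continuous degradation onto a coarse observed state index."""
    return min(int(degradation * N_DEG_LEVELS), N_DEG_LEVELS - 1)


N_STEPS = 2000
for step in range(N_STEPS):
    degradation = step / N_STEPS          # engine slowly degrades over time
    s = degradation_level(degradation)

    # epsilon-greedy selection of the variable-geometry setting
    if rng.random() < EPSILON:
        a = int(rng.integers(N_SETTINGS))
    else:
        a = int(np.argmax(Q[s]))

    r = measured_efficiency(a, degradation)
    s_next = degradation_level(min(degradation + 1 / N_STEPS, 1.0))

    # standard Q-learning update driven by the measured reward
    Q[s, a] += ALPHA * (r + GAMMA * Q[s_next].max() - Q[s, a])

print("Learned best setting per degradation level:", Q.argmax(axis=1))
```

In this toy setup the learned best setting shifts as the degradation state advances, which is the behaviour the description attributes to adapting the efficiency maps online; the paper's dual-loop structure and the guarantees on the thrust loop are outside the scope of this sketch.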