Application of Model-based Deep Reinforcement Learning Framework to Thermal Power Plant Operation Considering Performance Change
Abstract
In recent years, expectations have grown for advanced operational support systems that can automate complex tasks and autonomously optimize operational procedures in thermal power plants. Because equipment performance changes during operation and maintenance, the operating process must be adjusted to continue satisfying operational constraints. In this study, we investigated a model-based deep reinforcement learning framework that uses a digital twin model to acquire control methods robust to changes in equipment performance. A case study on the operational planning of a thermal power plant is presented, demonstrating that a stable control system can be constructed even as plant characteristics change.
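The abstract does not detail the algorithm or plant model, so the following is only a minimal sketch of the underlying idea of training against a digital twin whose performance parameters vary. A toy first-order plant stands in for the twin, its gain and time constant are randomized per episode to imitate equipment performance change, and a simple random search (used here in place of deep reinforcement learning) tunes a proportional feedback gain so the controller stays stable across the sampled variations. All parameter values, function names, and the plant model itself are illustrative assumptions, not the authors' implementation.

# Illustrative sketch only: a toy "digital twin" with randomized performance
# parameters and a random-search tuning loop standing in for deep RL.
import numpy as np

rng = np.random.default_rng(0)

def simulate(k_fb, gain, tau, setpoint=1.0, dt=0.1, steps=300):
    """Run the toy first-order twin under proportional feedback; return tracking cost."""
    x, cost = 0.0, 0.0
    for _ in range(steps):
        u = k_fb * (setpoint - x)          # control action from current error
        x += dt * (-x + gain * u) / tau    # first-order plant dynamics
        cost += (setpoint - x) ** 2 + 0.01 * u ** 2
    return cost

def evaluate(k_fb, n_episodes=20):
    """Average cost over randomized plant performance (domain randomization)."""
    costs = []
    for _ in range(n_episodes):
        gain = rng.uniform(0.7, 1.3)       # degraded or recovered equipment gain
        tau = rng.uniform(5.0, 15.0)       # slower or faster thermal response
        costs.append(simulate(k_fb, gain, tau))
    return float(np.mean(costs))

# Random search over the feedback gain, keeping the most robust candidate.
best_k, best_cost = 1.0, evaluate(1.0)
for _ in range(200):
    k = best_k + rng.normal(scale=0.2)
    c = evaluate(k)
    if c < best_cost:
        best_k, best_cost = k, c

print(f"robust gain: {best_k:.3f}, mean cost: {best_cost:.2f}")

Averaging the cost over randomized twin parameters is what makes the tuned controller robust to performance change; the paper's framework presumably replaces both the toy plant and the random search with its digital twin model and a deep reinforcement learning agent.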
Keywords: Reinforcement learning, Digital twin, Thermal power generation, Operational flexibility
This work is licensed under a Creative Commons Attribution 3.0 Unported License.