Controlling Tracking Performance for System Health Management - A Markov Decision Process Formulation


Published Nov 3, 2020
Brian Bole, Kai Goebel, George Vachtsevanos

Abstract

After an incipient fault mode has been detected, a natural question to ask is: how long can the system continue to be operated before the incipient fault mode degrades into a failure condition? In many cases, answering this question is complicated by the fact that further fault growth will depend on how the system is used in the future. The problem is complicated even further when we consider that the future operation of a system may itself be conditioned on estimates of the system's current health and on predictions of future fault evolution. This paper introduces a notationally convenient formulation of this problem as a Markov decision process. It is then shown that prognostics-based fault management policies can be identified using standard Markov decision process optimization techniques. A case study is analyzed in which a discrete random walk is used to represent time-varying system loading demands. A comparison of fault management policies computed with and without future uncertainty illustrates the limiting effects of model uncertainty on prognostics-informed fault management policies.
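To make the flavor of this formulation concrete, the sketch below solves a deliberately tiny fault-management MDP by standard value iteration. This is not the paper's model: the state space (discrete fault-severity levels with an absorbing failure state), the two load actions, and all transition probabilities, rewards, and the discount factor are invented for illustration only.

```python
import numpy as np

# Hypothetical toy model: fault severity levels 0..4; state 4 is an
# absorbing failure state. The single decision is how heavily to load
# the faulty component at each step.
N_STATES = 5            # fault severity levels; state 4 = failure
ACTIONS = [0, 1]        # 0 = light load, 1 = heavy load
GROWTH_P = [0.1, 0.4]   # per-step fault-growth probability under each load
REWARD = [1.0, 3.0]     # per-step operating reward under each load
FAIL_COST = -20.0       # penalty incurred on entering the failure state
GAMMA = 0.95            # discount factor

def value_iteration(tol=1e-8):
    """Compute the optimal value function and policy for the toy MDP."""
    V = np.zeros(N_STATES)
    while True:
        Q = np.zeros((N_STATES, len(ACTIONS)))
        for s in range(N_STATES - 1):       # failure state keeps value 0
            for a in ACTIONS:
                p = GROWTH_P[a]             # probability fault grows one level
                nxt = s + 1
                term = FAIL_COST if nxt == N_STATES - 1 else 0.0
                Q[s, a] = REWARD[a] + GAMMA * (
                    p * (term + V[nxt]) + (1 - p) * V[s])
        V_new = Q.max(axis=1)
        if np.max(np.abs(V_new - V)) < tol:
            return V_new, Q.argmax(axis=1)
        V = V_new

V, policy = value_iteration()
print(policy[:4])  # optimal load choice per operating state
```

Even in this toy setting, the optimal policy exhibits the trade-off discussed in the paper: it commands heavy loading while the component is healthy and backs off to light loading as the fault approaches failure, trading immediate reward against the risk of incurring the failure penalty.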


Keywords

prognostics, uncertainty management, Asset health management, Markov Decision Process, Risk-Reward Trade-off

Section
Technical Papers