Gas Turbine Engine Condition Monitoring Using Gaussian Mixture and Hidden Markov Models


Published Nov 20, 2020
William R. Jacobs, Huw L. Edwards, Ping Li, Visakan Kadirkamanathan, Andrew R. Mills

Abstract

This paper investigates the problem of condition monitoring of complex dynamic systems, specifically the detection, localisation and quantification of transient faults. A data-driven approach to fault detection is developed in which the multidimensional data sequence is viewed as a stochastic process whose behaviour can be described by a hidden Markov model with two hidden states, 'healthy/nominal' and 'unhealthy/faulty'. Fault detection is performed by first clustering in the multidimensional data space to define normal operating behaviour using a Gaussian-uniform mixture model. The health status of the system at each data point is then determined by evaluating the posterior probabilities of the hidden states of the hidden Markov model, which allows the temporal relationship between sequential data points to be incorporated into the fault detection scheme. The proposed scheme is robust to noise and requires minimal tuning. A real-world case study, the detection of transient faults in the variable stator vane actuator of a gas turbine engine, demonstrates the successful application of the scheme. The results demonstrate the generation of simple, easily interpretable analytics that can be used to monitor the evolution of the fault over time.
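The following is a minimal, illustrative sketch of the kind of pipeline described above, not the authors' implementation. It assumes NumPy/SciPy, models nominal behaviour with a single Gaussian plus a uniform outlier component (a simplification of the Gaussian-uniform mixture used in the paper), and smooths the per-sample health assessment with a two-state HMM forward-backward pass. The function names, the self-transition probability, and the parameter values are assumptions chosen for illustration.

```python
# Illustrative sketch (not the authors' code): Gaussian + uniform mixture for
# nominal behaviour, then two-state HMM posteriors over {healthy, faulty}.
import numpy as np
from scipy.stats import multivariate_normal


def fit_gaussian_uniform(X, n_iter=50):
    """EM for a single Gaussian ('nominal') plus a uniform ('outlier') component."""
    n, d = X.shape
    vol = np.prod(X.max(0) - X.min(0))                 # assumed uniform support volume
    mu, cov, w = X.mean(0), np.cov(X.T) + 1e-6 * np.eye(d), 0.9
    for _ in range(n_iter):
        p_g = w * multivariate_normal.pdf(X, mu, cov)  # Gaussian component density
        p_u = (1.0 - w) / vol                          # uniform component density
        r = p_g / (p_g + p_u)                          # responsibility of the Gaussian
        w = r.mean()
        mu = (r[:, None] * X).sum(0) / r.sum()
        diff = X - mu
        cov = (r[:, None] * diff).T @ diff / r.sum() + 1e-6 * np.eye(d)
    return mu, cov, w, vol


def hmm_health_posterior(X, mu, cov, w, vol, p_stay=0.99):
    """Forward-backward posteriors for hidden states 0='healthy', 1='faulty'."""
    b = np.column_stack([multivariate_normal.pdf(X, mu, cov),   # healthy emission
                         np.full(len(X), 1.0 / vol)])           # faulty emission
    A = np.array([[p_stay, 1 - p_stay],                         # sticky transitions
                  [1 - p_stay, p_stay]])
    pi = np.array([w, 1 - w])
    alpha, beta = np.zeros_like(b), np.ones_like(b)
    alpha[0] = pi * b[0]; alpha[0] /= alpha[0].sum()
    for t in range(1, len(X)):                                   # forward pass
        alpha[t] = (alpha[t - 1] @ A) * b[t]; alpha[t] /= alpha[t].sum()
    for t in range(len(X) - 2, -1, -1):                          # backward pass
        beta[t] = A @ (b[t + 1] * beta[t + 1]); beta[t] /= beta[t].sum()
    gamma = alpha * beta
    return gamma / gamma.sum(1, keepdims=True)                   # column 1 = P(faulty | data)
```

In this sketch the near-diagonal transition matrix encodes the temporal persistence of the health state, so isolated noisy samples that happen to fall far from the nominal cluster are not flagged as faults, which is the role the hidden Markov model plays in the scheme described in the abstract.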



Keywords

condition monitoring, fault detection, Hidden Markov Model, Gaussian Mixture Model, gas turbine engine

Section
Technical Papers