Deep Feature Learning Network for Fault Detection and Isolation

Published Oct 2, 2017
Gabriel Michau, Thomas Palmé, Olga Fink

Abstract

Prognostics and Health Management (PHM) approaches typically involve several signal processing and feature engineering steps. The state of the art in feature engineering, comprising feature extraction and feature dimensionality reduction, typically provides problem-specific solutions that rarely transfer or generalize, and it usually requires expert knowledge and extensive manual intervention. In this paper, we propose a new integrated feature learning approach for jointly achieving fault detection and fault isolation in high-dimensional condition monitoring data. The proposed approach, based on Hierarchical Extreme Learning Machines (HELM), demonstrates a good ability to detect and isolate faults in large datasets comprising signals of different natures, non-informative signals, non-linear relationships and noise. The method combines stacked auto-encoders, which learn the underlying high-level features, with a one-class classifier that aggregates the learned features into an indicator representing the deviation from normal system behavior. Once a deviation is identified, the features are used to isolate the most deviating signal components. Two case studies highlight the benefits of the approach: first, a synthetic dataset with the typical characteristics of condition monitoring data and different types of faults is used to evaluate the performance with objective metrics; second, the approach is tested on data stemming from an inter-turn failure in a power plant generator. In both cases, the results are compared to those of other commonly applied approaches for fault isolation.
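To make the architecture described in the abstract concrete, the following is a minimal Python/NumPy sketch of an HELM-style pipeline: stacked ELM auto-encoder layers learn a compressed representation of healthy condition monitoring data, and a one-class ELM regresses the learned features to a constant target so that the deviation of its output from that target acts as a health indicator. The layer sizes, sigmoid activation, ridge-regularized (rather than sparse) output weights, threshold-free scoring, and synthetic data are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def elm_autoencoder_layer(X, n_hidden, ridge=1e-3):
    """One ELM auto-encoder layer: random hidden mapping, ridge-regressed output
    weights beta reconstructing X; the learned features are sigmoid(X @ beta.T)."""
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = sigmoid(X @ W + b)
    beta = np.linalg.solve(H.T @ H + ridge * np.eye(n_hidden), H.T @ X)
    return sigmoid(X @ beta.T), beta

def one_class_elm(X, n_hidden=50, ridge=1e-3):
    """One-class ELM trained on healthy data only: regress random hidden features
    to the constant target 1; |1 - prediction| then serves as a deviation score."""
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = sigmoid(X @ W + b)
    t = np.ones((X.shape[0], 1))
    beta = np.linalg.solve(H.T @ H + ridge * np.eye(n_hidden), H.T @ t)
    return W, b, beta

def health_indicator(oc_model, betas, X):
    """Propagate new data through the stacked auto-encoder layers, then score it."""
    for beta in betas:
        X = sigmoid(X @ beta.T)
    W, b, oc_beta = oc_model
    return np.abs(1.0 - sigmoid(X @ W + b) @ oc_beta).ravel()

# Healthy training data and a shifted "faulty" batch (synthetic, illustrative only).
X_healthy = rng.standard_normal((500, 20))
X_faulty = X_healthy[:50] + 2.0

# Stack two auto-encoder layers, then fit the one-class ELM on the learned features.
features, betas = X_healthy, []
for n_hidden in (40, 10):
    features, beta = elm_autoencoder_layer(features, n_hidden)
    betas.append(beta)
oc_model = one_class_elm(features)

print("mean healthy score:", health_indicator(oc_model, betas, X_healthy).mean())
print("mean faulty score: ", health_indicator(oc_model, betas, X_faulty).mean())
```

In this sketch only the output weights are learned (in closed form), which is what keeps ELM-based training fast; isolation of the most deviating signal components, as described in the paper, would additionally require tracing the indicator back to the input signals.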

How to Cite

Michau, G., Palmé, T., & Fink, O. (2017). Deep Feature Learning Network for Fault Detection and Isolation. Annual Conference of the PHM Society, 9(1). https://doi.org/10.36001/phmconf.2017.v9i1.2380

Keywords

Extreme Learning Machine, Artificial Neural Network, Remaining Useful Life, Fault Detection, Fault Isolation

Section
Technical Research Papers