Maximal information-based nonparametric exploration of condition monitoring data

Published Jul 5, 2016
Yang Hu, Thomas Palmé, Olga Fink

Abstract

The system condition of valuable assets such as power plants is often monitored with thousands of sensors. A full evaluation of all sensors is normally not performed; instead, most of the important failures are captured by established algorithms that use a selection of parameters and compare them to defined limits or references.
Due to the availability of massive amounts of data and many different feature extraction techniques, the application of feature learning within fault detection and subsequent prognostics has been increasing. These approaches provide powerful results; however, in many cases they are not able to isolate the signal or set of signals that caused a change in the system condition.
Therefore, approaches are required that, once a fault is detected, isolate the signals whose behavior has changed and provide this information to diagnostics and maintenance engineers for further evaluation of the system state.
In this paper, we propose the application of Maximal Information-based Nonparametric Exploration (MINE) statistics for fault isolation and detection in condition monitoring data.
The MINE statistics provide normalized scores for the strength of a relationship, its departure from monotonicity, its closeness to being a function, and its complexity. These characteristics make the MINE statistics a good tool for monitoring the pair-wise relationships between condition monitoring signals and for detecting changes in these relationships over time.
The application of MINE statistics in the context of condition monitoring is demonstrated on an artificial case study. The case study focuses in particular on two of the MINE indicators: the Maximal Information Coefficient (MIC) and the Maximum Asymmetry Score (MAS).
The MINE statistics prove particularly useful when a change in system condition is reflected in the relationship between two signals, which is usually difficult to capture with other metrics.
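
As a rough illustration of the idea described in the abstract, the following sketch computes MIC and MAS on sliding windows of a signal pair using the open-source minepy package. The package choice, the window length, the MINE parameters (alpha, c), and the alarm threshold are illustrative assumptions and are not taken from the paper.

# Minimal sketch (not from the paper): sliding-window MIC/MAS monitoring of a
# signal pair with the open-source minepy package. Window length, MINE
# parameters, and the alarm threshold are illustrative assumptions.
import numpy as np
from minepy import MINE


def mine_scores(x, y, alpha=0.6, c=15):
    """Return (MIC, MAS) for two equally long 1-D signals."""
    mine = MINE(alpha=alpha, c=c)
    mine.compute_score(x, y)
    return mine.mic(), mine.mas()


def monitor_pair(x, y, window=500, step=100):
    """Compute MIC and MAS on sliding windows over a pair of signals."""
    results = []
    for start in range(0, len(x) - window + 1, step):
        mic, mas = mine_scores(x[start:start + window], y[start:start + window])
        results.append((start, mic, mas))
    return results


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 20.0, 2000)
    x = np.sin(t) + 0.05 * rng.standard_normal(t.size)
    # Healthy behavior: y tracks x linearly with some noise.
    y = 0.8 * x + 0.05 * rng.standard_normal(t.size)
    # Simulated fault in the second half: the relationship to x is lost.
    y[1000:] = 0.5 * rng.standard_normal(1000)

    for start, mic, mas in monitor_pair(x, y):
        flag = "relationship change?" if mic < 0.5 else ""  # illustrative threshold
        print(f"window@{start:5d}  MIC={mic:.2f}  MAS={mas:.2f}  {flag}")

In the spirit of the abstract, a pronounced drop in MIC (or a shift in MAS) relative to a healthy baseline would flag the signal pair for further diagnostic evaluation.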

How to Cite

Hu, Y., Palmé, T., & Fink, O. (2016). Maximal information-based nonparametric exploration of condition monitoring data. PHM Society European Conference, 3(1). https://doi.org/10.36001/phme.2016.v3i1.1625

Keywords

Fault Detection, Condition Monitoring, Maximal Information

Section
Technical Papers