An Influence Gauge to Detect and Explain Relations between Measurements and a Performance Indicator


Published Oct 18, 2015
Jérôme Lacaille

Abstract

What if a software tool could behave like a gauge, estimating the quantity of information contained in a group of measurements? And given a performance indicator or a defect rate, how could we compute the maximum explanation of that performance contained in our dataset? The first question is answered by entropy, the second by mutual information. This paper recalls a simple way to use these mathematical tools in an application meant to be launched each time a new dataset has to be studied. The PHM team at Snecma is often asked to join task forces analyzing sudden crises, and this methodology helps, at the very beginning of the process, to assess our mathematical ability to build an explanation model. This was the case during a small-engine start crisis in which some spark plugs were not working; another time the tool was used to identify the flight condition under which a gearbox was overheating. The methodology was first developed for industrial purposes such as the optimization of machine tools or process recipes. Its success lies in the simplification of the computations, which enhances the interpretability of the results. Each signal is quantized so as to increase its mutual information with the performance indicator. This is done signal by signal, but also for small subsets of multivariate measurements, until the confidence supported by the quantity and quality of the data reaches its maximum. The segmentation of the data also speeds up the computation of the integrals. Moreover, because the methodology takes quantized data as inputs, it works equally well with continuous, discrete ordered, and even categorical measurements. Once a best subset of measurements is selected, a simple non-linear model is built with a relaxation algorithm. This model is a set of hypercubes that classifies the input space in a simple and interpretable way. The methodology given below is a rough approach and may be replaced by more efficient regression algorithms when only continuous measurements are available, but it offers advantages such as a way to search for a "best rule" under constraints and a graphical navigation tool that is very efficient for correcting recipes.
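To make the idea of the gauge concrete, here is a minimal Python sketch of the core step described above: quantize a measurement into bins and estimate its mutual information with a discrete performance indicator, so that measurements can be ranked by how much of the indicator they explain. The function and parameter names (influence_gauge, n_bins) are illustrative assumptions, not the paper's implementation, and the equal-frequency binning is only one plausible quantization choice.

```python
# Sketch of an "influence gauge": score one measurement by its empirical
# mutual information with a discrete performance indicator (assumption:
# equal-frequency binning; the paper optimizes the quantization itself).
import numpy as np

def quantize(x, n_bins=8):
    """Quantize a continuous signal into equal-frequency bins (labels 0..n_bins-1)."""
    edges = np.quantile(x, np.linspace(0.0, 1.0, n_bins + 1)[1:-1])
    return np.digitize(x, edges)

def mutual_information(a, b):
    """Empirical mutual information (in bits) between two discrete label arrays."""
    joint = np.zeros((a.max() + 1, b.max() + 1))
    for i, j in zip(a, b):
        joint[i, j] += 1.0
    joint /= joint.sum()
    pa = joint.sum(axis=1, keepdims=True)   # marginal of a
    pb = joint.sum(axis=0, keepdims=True)   # marginal of b
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (pa @ pb)[nz])).sum())

def influence_gauge(signal, indicator, n_bins=8):
    """How much does this measurement explain the indicator? Higher is better."""
    return mutual_information(quantize(signal, n_bins), indicator)

# Usage: rank candidate measurements against a defect indicator.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    temperature = rng.normal(size=500)
    noise = rng.normal(size=500)
    defect = (temperature > 1.0).astype(int)      # indicator driven by temperature
    print(influence_gauge(temperature, defect))   # clearly positive score
    print(influence_gauge(noise, defect))         # close to zero
```

In the same spirit, the subset-selection and hypercube model of the paper would extend this score to small groups of quantized measurements; the sketch only covers the single-signal gauge.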

How to Cite

Lacaille, J. (2015). An Influence Gauge to Detect and Explain Relations between Measurements and a Performance Indicator. Annual Conference of the PHM Society, 7(1). https://doi.org/10.36001/phmconf.2015.v7i1.2722


Keywords

information theory, multivariate analysis, engine health monitoring

Section
Technical Research Papers
