Fault Detection and Prognosis of Time Series Data with Random Projection Filter Bank



Published Oct 2, 2017
Sepideh Pourazarm, Amir-massoud Farahmand, Daniel Nikovski

Abstract

We introduce Random Projection Filter Bank (RPFB) as a general framework for feature extraction from time series data. RPFB is a set of randomly generated stable autoregressive filters that are convolved with the input time series. Filters in RPFB extract different aspects of the time series, and together they provide a reasonably good summary of the time series. These features can then be used by any conventional machine learning algorithm for solving tasks such as time series prediction, and fault detection and prognosis with time series data. RPFB is easy to implement, fast to compute, and parallelizable. Through a series of experiments, we show that RPFB alongside conventional machine learning algorithms can be effective in solving data-driven fault detection and prognosis problems.
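As a concrete illustration of the pipeline described above, the sketch below generates a bank of randomly parameterized, stable autoregressive filters, applies them to a univariate time series, and feeds the resulting features to an off-the-shelf regressor for one-step-ahead prediction. The specific choices here (first-order filters with poles drawn uniformly inside the unit disk, the number of filters, and a ridge regressor as the downstream learner) are illustrative assumptions and not details taken from the abstract.

```python
# Minimal sketch of the RPFB idea; not the authors' reference implementation.
import numpy as np
from scipy.signal import lfilter

def random_projection_filter_bank(x, n_filters=32, seed=0):
    """Apply randomly generated stable AR(1) filters to a 1-D series and
    return the filter outputs as features (two real columns per filter)."""
    rng = np.random.default_rng(seed)
    # Draw random poles inside the unit disk so every filter is stable.
    radii = rng.uniform(0.0, 0.99, size=n_filters)
    angles = rng.uniform(0.0, np.pi, size=n_filters)
    poles = radii * np.exp(1j * angles)

    feats = np.empty((len(x), 2 * n_filters))
    for i, p in enumerate(poles):
        # First-order IIR filter y[t] = p * y[t-1] + x[t], i.e. 1 / (1 - p q^{-1}).
        y = lfilter([1.0], [1.0, -p], x)
        # Keep real and imaginary parts as two real-valued feature columns.
        feats[:, 2 * i] = y.real
        feats[:, 2 * i + 1] = y.imag
    return feats

if __name__ == "__main__":
    # Example: extract features, then fit any conventional regressor
    # for one-step-ahead time series prediction.
    from sklearn.linear_model import Ridge
    t = np.arange(2000)
    x = np.sin(0.05 * t) + 0.1 * np.random.randn(len(t))
    X = random_projection_filter_bank(x[:-1])  # features summarizing history up to t
    y = x[1:]                                  # one-step-ahead target
    model = Ridge(alpha=1.0).fit(X, y)
    print("train R^2:", model.score(X, y))
```

Because the filters are generated and applied independently of one another, the loop over filters can be trivially parallelized, which is consistent with the abstract's remark that RPFB is parallelizable.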

How to Cite

Pourazarm, S., Farahmand, A.-m., & Nikovski, D. (2017). Fault Detection and Prognosis of Time Series Data with Random Projection Filter Bank. Annual Conference of the PHM Society, 9(1). https://doi.org/10.36001/phmconf.2017.v9i1.2426


Keywords

fault detection, time series, fault prognosis

Section
Technical Research Papers