Hamiltonian Monte Carlo Sampling for Bayesian Hierarchical Regression in Prognostics


Published Jul 14, 2017
Lachlan Astfalck, Melinda Hodkiewicz

Abstract

Advances in computational speed have enabled the development of many Bayesian probabilistic models through Markov chain Monte Carlo (MCMC) posterior sampling methods. These models include Bayesian hierarchical regression methods, which use group-level information to inform predictions for individual assets. Hierarchical models are increasingly used for prognostics as they recognise that the parameter estimates for an individual asset may be rationally influenced by data from other similar assets. Larger and higher-dimensional datasets require more efficient sampling methods than traditional MCMC techniques provide. Hamiltonian Monte Carlo (HMC) has been used across many fields to address high-dimensional, sparse, or non-conjugate data. Because HMC requires the derivative of the posterior and offers considerable flexibility in its tuning parameters, it is often difficult to hand-code. We investigate a probabilistic programming language, Stan, which implements HMC sampling, with particular focus on Bayesian hierarchical models in prognostics. The benefits and limitations of HMC in Stan are explored and compared to the widely used Gibbs sampler and Metropolis-Hastings (MH) algorithm. Results are demonstrated using three case studies on lithium-ion batteries. Stan reduced coding complexity and sampled from the posterior distributions more efficiently than the Metropolis-Hastings algorithm. HMC sampling became less efficient with increasing data size and hierarchical complexity, due to high curvature in the posterior distribution. Stan was shown to be a robust language that allows easier inference in the Bayesian paradigm.
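To make the modelling approach concrete, the sketch below shows a minimal Bayesian hierarchical linear regression written in Stan. It is an illustration only: a fleet of J assets is assumed, with per-asset intercepts and slopes drawn from group-level distributions, and the variable names (asset, x, y) and the priors are assumptions of this sketch rather than the exact specification used in the paper. Stan obtains the posterior gradients required by HMC via automatic differentiation, so no derivatives need to be hand-coded.

```stan
// Minimal sketch of a Bayesian hierarchical linear regression in Stan.
// Group-level (fleet) information informs each asset's intercept and
// slope. Variable names and priors are illustrative assumptions only.
data {
  int<lower=1> N;                  // number of observations
  int<lower=1> J;                  // number of assets (groups)
  int<lower=1, upper=J> asset[N];  // asset index for each observation
  vector[N] x;                     // covariate, e.g. charge cycle count
  vector[N] y;                     // response, e.g. measured capacity
}
parameters {
  real mu_a;                       // group-level mean intercept
  real mu_b;                       // group-level mean slope
  real<lower=0> sigma_a;           // between-asset intercept spread
  real<lower=0> sigma_b;           // between-asset slope spread
  real<lower=0> sigma_y;           // observation noise
  vector[J] a;                     // per-asset intercepts
  vector[J] b;                     // per-asset slopes
}
model {
  // Weakly informative priors (illustrative choices; the lower bound
  // on the scales makes the Cauchy priors half-Cauchy)
  mu_a ~ normal(0, 10);
  mu_b ~ normal(0, 10);
  sigma_a ~ cauchy(0, 2.5);
  sigma_b ~ cauchy(0, 2.5);
  sigma_y ~ cauchy(0, 2.5);

  // Hierarchy: per-asset parameters drawn from group-level distributions
  a ~ normal(mu_a, sigma_a);
  b ~ normal(mu_b, sigma_b);

  // Likelihood, vectorised over all observations
  y ~ normal(a[asset] + b[asset] .* x, sigma_y);
}
```

In a centred parameterization such as this, the joint posterior over the group-level scales (sigma_a, sigma_b) and the per-asset parameters can develop the strong curvature noted in the abstract as data size and hierarchical complexity grow; a non-centred reparameterization is a common remedy.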



Keywords

PHM

Section
Regular Session Papers