Physics-based prognostics: promises and challenges
Abstract
In this paper, an interesting observation on the noise-dependent performance of prognostics algorithms is presented, along with a method for evaluating the accuracy of prognostics algorithms without access to the true degradation model. We found that randomness in the noise leads to very different rankings of the algorithms for different datasets. In particular, even the algorithm that performs best on average can produce poor results for some datasets. In the absence of true damage information, we propose a metric, the mean squared discrepancy (MSD), which measures the difference between the prediction and the data. It is shown that ranking by MSD is strongly correlated with ranking based on the true degradation model. This may be particularly useful when information is available from multiple damage sites for the same application.
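A minimal sketch of how such a discrepancy metric might be computed and used to rank algorithms, assuming MSD is the mean of squared differences between a model's predictions and the measured degradation data; the exact definition and the model names below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def mean_squared_discrepancy(predictions, measurements):
    """Illustrative MSD: mean squared difference between model predictions
    and measured degradation data (assumed form; see the paper for details)."""
    predictions = np.asarray(predictions, dtype=float)
    measurements = np.asarray(measurements, dtype=float)
    return float(np.mean((predictions - measurements) ** 2))

# Hypothetical example: rank two candidate prognostics models against one dataset.
measured = np.array([0.10, 0.15, 0.22, 0.30, 0.41])   # observed damage sizes (made up)
model_a  = np.array([0.11, 0.16, 0.21, 0.31, 0.40])   # predictions from model A (made up)
model_b  = np.array([0.09, 0.18, 0.28, 0.38, 0.52])   # predictions from model B (made up)

scores = {"A": mean_squared_discrepancy(model_a, measured),
          "B": mean_squared_discrepancy(model_b, measured)}
ranking = sorted(scores, key=scores.get)  # smaller MSD ranks first
print(ranking, scores)
```

Since MSD is computed only from predictions and measured data, it can be evaluated in the field where the true degradation model is unknown, which is the setting the abstract describes.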
Keywords: PHM