A Transfer Learning Framework for Remaining Useful Life Estimation

Published Jan 13, 2026
Melanie Bianca Sigl
Klaus Meyer-Wegener

Abstract

Training a robust deep learning (DL) model for remaining useful life (RUL) estimation or fault detection typically requires a large, high-quality labeled dataset. However, such datasets are often unavailable in practice. Transfer learning offers a remedy for smaller labeled datasets by reusing a DL model trained on a related source dataset. Yet, the effectiveness of transfer learning depends heavily on selecting an appropriate source model; an unsuitable choice can result in negative transfer, where model performance deteriorates significantly.

To address this challenge, we introduce REAPER (Reusable Neural Network Pattern Repository), a framework designed to assist users in selecting the most suitable DL model for reuse in transfer learning scenarios. REAPER analyzes and compares the characteristics of the available datasets and employs a learned ranking model to recommend the best-suited source model. This paper presents the REAPER architecture, including its dataset characterization, ranking methodology, training procedure, and practical usage guidance.
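
To convey the idea at a high level, the minimal Python sketch below is an illustration only, not the implementation described in the paper: it assumes datasets are summarized by a handful of hand-picked statistics and ranks candidate source models by a simple distance to the target dataset's characteristics, whereas REAPER uses its own dataset characterization and a learned ranking model.

```python
# Illustrative sketch only. The characteristics and the scoring function here
# are assumptions for demonstration; REAPER's actual characterization and
# learned ranking model are described in the paper.
import numpy as np

def characterize(dataset: np.ndarray) -> np.ndarray:
    """Summarize a multivariate time-series dataset (shape: [time, sensors])
    with a few simple statistics."""
    return np.array([
        dataset.shape[1],                          # number of sensor channels
        dataset.shape[0],                          # sequence length
        dataset.mean(),                            # global mean
        dataset.std(),                             # global standard deviation
        np.abs(np.diff(dataset, axis=0)).mean(),   # mean first-difference magnitude
    ])

def rank_sources(target: np.ndarray, sources: dict[str, np.ndarray]) -> list[str]:
    """Rank candidate source datasets (each with an associated pretrained model)
    by similarity of their characteristics to the target dataset.
    A learned ranking model would replace this hand-coded distance."""
    t = characterize(target)
    scored = []
    for name, data in sources.items():
        s = characterize(data)
        # Negative Euclidean distance as a stand-in for a learned relevance score.
        scored.append((name, -np.linalg.norm(t - s)))
    return [name for name, _ in sorted(scored, key=lambda x: x[1], reverse=True)]

# Usage: pick the most promising source model for a small target dataset.
rng = np.random.default_rng(0)
target = rng.normal(size=(200, 14))               # e.g., 14 sensors, 200 cycles
sources = {
    "turbofan_model": rng.normal(size=(5000, 14)),
    "bearing_model":  rng.normal(loc=2.0, size=(3000, 8)),
}
print(rank_sources(target, sources))              # best-matching source first
```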

Keywords

Transfer Learning, Remaining Useful Life Estimation, Model Selection, Deep Learning Framework

Section
Regular Session Papers