Active learning for gear defect detection in gearboxes
Abstract
Condition monitoring of gears in gearboxes is crucial to ensure performance and minimize downtime in many industrial applications, including wind turbines and automotive systems. Monitoring techniques based on indirect measurements (e.g., accelerometers, microphones, acoustic emission sensors, and encoders) face challenges in interpreting and characterizing defects. Vision-based gear condition monitoring, as a direct method of observing gear defects, can give a precise indication of the onset of a potential surface failure, but it requires extensive image annotation to train a reliable vision model for automatic defect detection. In this paper, we propose an active learning framework for vision-based condition monitoring that reduces the human annotation effort by labelling only the most informative examples. In particular, we first train a deep learning model on a limited, randomly annotated training dataset to detect pitting defects. To decide which samples should be annotated first, we estimate the model's uncertainty on all remaining unlabeled examples using Bayesian active learning by disagreement, and the samples with the highest uncertainty are selected for annotation. Experimental results on defect detection of gears in gearboxes show that similar performance can be achieved with roughly six times fewer image annotations.
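The acquisition step described in the abstract can be illustrated with a short sketch. The snippet below is a minimal, illustrative implementation (not the authors' code) of Bayesian active learning by disagreement (BALD) scored with Monte Carlo dropout in PyTorch. The names model, unlabeled_loader, and n_mc are assumptions, and the model is assumed to be a segmentation network containing dropout layers that outputs per-pixel class logits of shape (batch, classes, H, W).

```python
# Minimal sketch (assumed setup, not the paper's implementation):
# BALD acquisition scores via Monte Carlo dropout for a segmentation model.
import torch
import torch.nn.functional as F


def bald_scores(model, unlabeled_loader, n_mc=20, device="cuda"):
    """Return one BALD score per unlabeled image (higher = more informative)."""
    model.to(device)
    model.train()  # keep dropout layers active for Monte Carlo sampling
    scores = []
    with torch.no_grad():
        for images in unlabeled_loader:
            images = images.to(device)
            # n_mc stochastic forward passes -> (T, B, C, H, W) probabilities
            probs = torch.stack(
                [F.softmax(model(images), dim=1) for _ in range(n_mc)]
            )
            mean_p = probs.mean(dim=0)  # (B, C, H, W)
            # Entropy of the mean prediction (total predictive uncertainty)
            h_mean = -(mean_p * torch.log(mean_p + 1e-12)).sum(dim=1)
            # Expected entropy of the individual stochastic predictions
            mean_h = -(probs * torch.log(probs + 1e-12)).sum(dim=2).mean(dim=0)
            # BALD = mutual information between prediction and model weights,
            # averaged over pixels to obtain one score per image
            bald = (h_mean - mean_h).mean(dim=(1, 2))
            scores.append(bald.cpu())
    return torch.cat(scores)
```

In an active learning loop, the images with the largest BALD scores would be sent to the human annotator first, added to the labelled pool, and the model retrained before the next acquisition round.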
Keywords: Gearbox, condition monitoring, vision-based, defect detection, active learning, deep learning
This work is licensed under a Creative Commons Attribution 3.0 Unported License.