Improving maintenance knowledge intelligence is challenging because maintenance information is recorded mainly as free text. To unlock the knowledge held in this text, we propose a decision-making solution that retrieves similar historical cases to help solve new maintenance problems. In this work, an unsupervised domain fine-tuning technique, the Transformer-based Sequential Denoising Auto-Encoder (TSDAE), is used to fine-tune a BERT (Bidirectional Encoder Representations from Transformers) model on a domain-specific corpus composed of Maintenance Work Orders (MWOs). Unsupervised fine-tuning helped the BERT model adapt to MWO text. Results indicate that the fine-tuned BERT model can find semantic matches between MWOs despite the complex nature of maintenance text.
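The case-retrieval step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the work-order texts and the small embedding vectors are hypothetical stand-ins, whereas in practice the vectors would be produced by the TSDAE fine-tuned BERT encoder, and retrieval would rank historical MWOs by cosine similarity to the new problem's embedding.

```python
import numpy as np

def cosine_sim(a, b):
    # Cosine similarity between two embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def retrieve_similar(query_vec, corpus_vecs, top_k=1):
    # Rank historical MWO embeddings by similarity to the query embedding
    # and return the indices of the top_k closest cases.
    scores = [cosine_sim(query_vec, v) for v in corpus_vecs]
    return sorted(range(len(scores)), key=lambda i: -scores[i])[:top_k]

# Hypothetical 3-dimensional embeddings for three historical work orders.
corpus = [np.array([0.9, 0.1, 0.0]),   # e.g. "pump seal leaking"
          np.array([0.0, 0.8, 0.2]),   # e.g. "motor bearing noise"
          np.array([0.0, 0.0, 1.0])]   # e.g. "replace air filter"
query = np.array([0.85, 0.15, 0.05])   # a new order resembling the pump case

print(retrieve_similar(query, corpus, top_k=2))  # → [0, 1]
```

The returned indices point the maintainer to the most similar past work orders, whose recorded resolutions can then inform the new repair decision.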
Keywords: Prognostics and Health Management (PHM), Decision Support System, Natural Language Processing, Technical Language Processing, BERT
This work is licensed under a Creative Commons Attribution 3.0 Unported License.