Shared Temporal Attention Transformer for Remaining Useful Lifetime Estimation

Author: Gavneet Singh Chadha, Sayed Rafay Bin Shah, Andreas Schwung, Steven X. Ding
Language: English
Year of publication: 2022
Subject:
Source: IEEE Access, Vol 10, Pp 74244-74258 (2022)
Document type: article
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2022.3187702
Description: This paper proposes a novel deep learning architecture for estimating the remaining useful lifetime (RUL) of industrial components that relies solely on the recently developed transformer architectures. RUL estimation amounts to analysing degradation patterns within multivariate time series signals. Hence, we propose a novel shared temporal attention block that detects RUL patterns as time progresses. Furthermore, we develop a split-feature attention block that attends to features from different sensor channels. The proposed shared temporal attention layer in the encoder attends to temporal degradation patterns in the individual sensor signals before creating a shared correlation across the feature range. Based on these novel attention blocks, we develop two transformer architectures specifically designed to operate on multivariate time series data. We apply the architectures to the well-known C-MAPSS benchmark dataset and provide various hyperparameter studies to analyse their impact on performance. In addition, we provide a thorough comparison with recently presented state-of-the-art approaches and show that the proposed transformer architectures outperform the existing methods by a considerable margin.
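The core idea of shared temporal attention — running self-attention along the time axis of each sensor channel independently while sharing one set of projection weights across all channels — can be sketched as follows. This is an illustrative reconstruction based only on the abstract, not the authors' exact block; the function name, tensor shapes, and single-head formulation are assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def shared_temporal_attention(x, Wq, Wk, Wv):
    """Temporal self-attention applied per sensor channel with shared weights.

    x:  (channels, timesteps, d) — each channel is treated as its own sequence.
    Wq, Wk, Wv: (d, d) projections, shared across all channels, so every
    channel is scored by the same temporal attention mechanism.
    """
    q, k, v = x @ Wq, x @ Wk, x @ Wv                 # each (C, T, d)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(x.shape[-1])  # (C, T, T)
    return softmax(scores, axis=-1) @ v              # (C, T, d)

# Toy multivariate time series: 4 sensor channels, 30 timesteps, model dim 8.
rng = np.random.default_rng(0)
C, T, d = 4, 30, 8
x = rng.normal(size=(C, T, d))
Wq, Wk, Wv = (0.1 * rng.normal(size=(d, d)) for _ in range(3))
out = shared_temporal_attention(x, Wq, Wk, Wv)
print(out.shape)  # (4, 30, 8)
```

Because the projections are shared, the parameter count is independent of the number of sensor channels; per-channel attention maps are still computed separately, which matches the abstract's description of attending to degradation patterns in individual sensor signals before features are correlated across channels.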
Database: Directory of Open Access Journals