Author:
Qilang Min, Juan-Juan He, Piaoyao Yu, Yue Fu
Language:
English
Year of publication:
2023
Subject:
Source:
IEEE Access, Vol 11, Pp 46015-46025 (2023)
Document type:
article
ISSN:
2169-3536
DOI:
10.1109/ACCESS.2023.3274481
Description:
Incremental learning-based fault diagnosis (IFD) systems are widely used because they can handle continually updated fault data and fault types. However, catastrophic forgetting remains the most pressing challenge facing IFD. This paper proposes an incremental fault diagnosis method based on metric feature distillation (MFD) and an improved sample memory to address this problem. First, metric feature distillation is designed by combining metric learning with feature distillation: a distillation loss and a triplet loss constrain the network parameters of the old and new tasks to the same feasible region, effectively alleviating catastrophic forgetting. Then, for scenarios in which only a small amount of data can be stored, an improved sample memory strategy, called center and hard sample memory (CAHM), is introduced to further reduce catastrophic forgetting. CAHM better preserves the global information of the data, reducing the loss of old-task information during training. Experimental results on the CWRU and MFPT datasets verify the effectiveness of the proposed method.
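The description mentions two mechanisms: the MFD loss, which combines feature distillation with a triplet (metric-learning) loss, and the CAHM memory, which keeps both class-center samples and hard boundary samples. The following PyTorch sketch shows one plausible reading of those ideas; the function names, loss weights, in-batch hard-mining rule, and even center/hard split are illustrative assumptions, not the paper's actual implementation.

    import torch
    import torch.nn.functional as F

    def mfd_loss(feat_new, feat_old, labels, lam_kd=1.0, lam_tri=1.0, margin=1.0):
        # Hypothetical MFD-style loss (assumed form, not the paper's code).
        # Feature distillation: keep the current model's features close to
        # those produced by the frozen old-task model.
        kd = F.mse_loss(feat_new, feat_old.detach())

        # Triplet loss with in-batch hard mining: for each anchor, take the
        # farthest same-class sample and the nearest different-class sample.
        dist = torch.cdist(feat_new, feat_new)
        same = labels.unsqueeze(0) == labels.unsqueeze(1)
        hardest_pos = dist.masked_fill(~same, float('-inf')).max(dim=1).values
        hardest_neg = dist.masked_fill(same, float('inf')).min(dim=1).values
        tri = F.relu(hardest_pos - hardest_neg + margin).mean()

        return lam_kd * kd + lam_tri * tri

    def cahm_select(features, labels, per_class=10):
        # Hypothetical CAHM-style exemplar selection: per class, keep the
        # samples nearest the class center (global structure) plus the
        # farthest, "hard" samples (decision-boundary information).
        keep = []
        for c in labels.unique():
            idx = (labels == c).nonzero(as_tuple=True)[0]
            center = features[idx].mean(dim=0, keepdim=True)
            d = torch.cdist(features[idx], center).squeeze(1)
            order = d.argsort()
            n_center = per_class // 2
            keep.extend(idx[order[:n_center]].tolist())                 # center samples
            keep.extend(idx[order[-(per_class - n_center):]].tolist())  # hard samples
        return keep

In this reading, the distillation term pins the new feature space to the old one while the triplet term keeps classes separable, which matches the stated aim of constraining old- and new-task parameters to the same feasible region; cahm_select would be run on old-task features before training on each new task.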
Database:
Directory of Open Access Journals
External link: