Author: |
Naigong Yu, Huaisheng Chen, Qiao Xu, Mohammad Mehedi Hasan, Ouattara Sie |
Language: |
English |
Year of publication: |
2023 |
Subject: |
|
Source: |
CAAI Transactions on Intelligence Technology, Vol 8, Iss 3, Pp 1029-1042 (2023) |
Document type: |
article |
ISSN: |
2468-2322 |
DOI: |
10.1049/cit2.12126 |
Description: |
Abstract: Accurately identifying defect patterns in wafer maps can help engineers find abnormal failure factors in production lines. During the wafer testing stage, deep learning methods are widely used in wafer defect detection due to their powerful feature extraction capabilities. However, most current wafer defect pattern classification models have high complexity and slow detection speed, which makes them difficult to apply in the actual wafer production process. In addition, there is a data imbalance in the wafer dataset that seriously affects the training results of the model. To reduce the complexity of the deep model without affecting the wafer feature expression, this paper adjusts the structure of the dense block in the PeleeNet network and proposes a lightweight network, WM‐PeleeNet, based on the PeleeNet module. In addition, to reduce the impact of data imbalance on model training, this paper proposes a wafer data augmentation method based on a convolutional autoencoder that adds random Gaussian noise to the hidden layer. The method proposed in this paper achieves an average accuracy of 95.4% on the WM‐811K wafer dataset with only 173.643 KB of parameters and 316.194 M FLOPs, and takes only 22.99 s to detect 1000 wafer pictures. Compared with the original PeleeNet network without optimization, the number of parameters and FLOPs are reduced by 92.68% and 58.85%, respectively. Data augmentation of the minority-class wafer maps improves the average classification accuracy by 1.8% on the WM‐811K dataset. At the same time, the recognition accuracy of minority classes such as the Scratch pattern and the Donut pattern is significantly improved. |
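The augmentation idea described in the abstract, encoding a minority-class wafer map, perturbing the hidden representation with random Gaussian noise, and decoding a synthetic sample, can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses a toy dense autoencoder with untrained random weights as a stand-in for the trained convolutional autoencoder, and all dimensions (`D`, `H`, `sigma`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a trained autoencoder: dense layers with random
# weights instead of trained convolutions, for brevity. In the paper a
# convolutional autoencoder trained on wafer maps plays this role.
D, H = 26 * 26, 32            # flattened wafer-map size, latent size (assumed)
W_enc = rng.standard_normal((D, H)) * 0.05
W_dec = rng.standard_normal((H, D)) * 0.05

def encode(x):
    return np.tanh(x @ W_enc)

def decode(z):
    # Sigmoid keeps outputs in [0, 1], matching normalized wafer maps.
    return 1.0 / (1.0 + np.exp(-(z @ W_dec)))

def augment(x, sigma=0.1):
    """Generate a synthetic wafer map by adding random Gaussian noise
    to the hidden-layer code, as in the paper's augmentation step."""
    z = encode(x)
    z_noisy = z + rng.normal(0.0, sigma, size=z.shape)
    return decode(z_noisy)

wafer = rng.random((1, D))    # one fake minority-class wafer map
synthetic = augment(wafer)
print(synthetic.shape)        # (1, 676)
```

Because the noise is injected in the latent space rather than in pixel space, each decoded sample is a plausible variation of the original defect pattern, which is what makes this useful for rebalancing minority classes such as Scratch and Donut.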
Database: |
Directory of Open Access Journals |
External link: |
|