Vertical-Line Mura Defect Detection for TFT-LCDs

Authors: Chuan-Yu Chang, Abida Khanum, Sheng-Gui Su, Manova Lebaku Moses, Kai-Xiang Liu
Language: English
Publication year: 2024
Source: IEEE Access, Vol 12, Pp 158927-158938 (2024)
Document type: article
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3486567
Abstract: Mura defects, which manifest as irregularities in brightness and color on display screens, present persistent challenges for Thin Film Transistor Liquid Crystal Display (TFT-LCD) manufacturers. While traditional methods effectively detect significant abnormalities in vertical-line inspections, identifying minor defects, particularly Level-1, remains a formidable task. In this study, we leverage artificial intelligence to address this challenge, focusing specifically on detecting vertical-line mura defects. Because Level-1 defects are difficult to discern directly from single images, where higher-level abnormalities are more conspicuous, our approach introduces You Only Look Once-Ghost Attention (YOLO-GA). This YOLOv8-based algorithm is designed to swiftly and accurately identify vertical-line mura in liquid crystal display (LCD) images, even amidst complex backgrounds and minor irregularities. To enhance the model's efficacy, we adopt two pivotal strategies. First, we incorporate Ghost layers into the backbone and neck networks to ensure lightweight deployment while improving feature extraction, particularly in images with intricate backgrounds. Second, we integrate the Convolutional Block Attention Module (CBAM) into the network's architecture, explicitly targeting vertical-line mura detection; this augmentation strengthens feature extraction and refines the detection of defects within LCDs. The dataset used in our study was provided by AUO Corporation and captured with a single-direction camera. It contains 107,080 images at a high resolution of 1624 × 1240, split 80:20 into training and validation sets. Our experimental results show that our approach is highly effective, achieving an mAP of 99.5% and an F1-score of 99.7%, outperforming both the baseline model and comparative attention modules.
Moreover, the proposed model significantly reduces the time required to recognize mura defects, fully meeting the manufacturer's production requirements within a mere 1-second timeframe.
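The abstract names CBAM as one of the two key additions to the YOLOv8 network. As a rough illustration of the idea, the following is a minimal NumPy sketch of CBAM's two sequential stages (channel attention, then spatial attention); it is not the authors' implementation, and the simple averaging used in place of CBAM's learned 7×7 convolution in the spatial stage is a stand-in assumption for brevity.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(x, w1, w2):
    # x: feature map of shape (C, H, W).
    # Global average- and max-pooled channel descriptors, each of shape (C,).
    avg = x.mean(axis=(1, 2))
    mx = x.max(axis=(1, 2))
    # Shared two-layer MLP (w1: C -> C/r, w2: C/r -> C) applied to both
    # descriptors; the outputs are summed and squashed to (0, 1) weights.
    att = sigmoid(w2 @ np.maximum(w1 @ avg, 0) + w2 @ np.maximum(w1 @ mx, 0))
    return x * att[:, None, None]

def spatial_attention(x):
    # Channel-wise average and max maps, each of shape (H, W).
    avg = x.mean(axis=0)
    mx = x.max(axis=0)
    # CBAM fuses these with a learned 7x7 convolution; a plain mean is used
    # here purely as an illustrative stand-in (assumption, not the real module).
    att = sigmoid((avg + mx) / 2.0)
    return x * att[None, :, :]

def cbam(x, w1, w2):
    # CBAM applies channel attention first, then spatial attention.
    return spatial_attention(channel_attention(x, w1, w2))
```

Because both attention maps lie in (0, 1), the module rescales features without changing the tensor shape, which is what lets it be dropped into the backbone or neck of a detector.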
Database: Directory of Open Access Journals