Enhancing Noisy Label Facial Expression Recognition With Split and Merge Consistency Regularization

Author: Jihyun Kim, Junehyoung Kwon, Mihyeon Kim, Eunju Lee, Youngbin Kim
Language: English
Publication year: 2023
Subject:
Source: IEEE Access, Vol 11, Pp 140496-140505 (2023)
Document type: article
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2023.3339763
Description: Facial expression recognition (FER) has been extensively studied across various applications over the past few years. However, in real-world facial expression datasets, labels can become noisy due to the ambiguity of expressions, the similarity between classes, and the subjectivity of annotators. These noisy labels degrade FER and significantly reduce classification performance, and previous methods tend to overfit as the noise ratio increases. To address this problem, we propose the split and merge consistency regularization (SMEC) method, which is robust to noisy labels because it examines multiple regions of a facial expression image rather than a single part, without altering the image's semantics. We split each facial expression image into two images and feed them to the backbone network to extract class activation maps (CAMs). The two CAMs are then merged, and robustness to noisy labels is improved by regularizing the consistency between the CAM of the original image and the merged CAM. The proposed SMEC method aims to improve FER performance and robustness against heavy label noise by preventing the model from focusing on only a single part while preserving the semantics of the facial expression images. On the unbalanced real-world affective faces database (RAF-DB), SMEC outperforms state-of-the-art noisy-label FER models in class-wise accuracy for both clean and noisy labels, even at severe noise rates of 40% to 60%.
Database: Directory of Open Access Journals
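
The description above outlines the core mechanism: split an image into two parts, extract class activation maps (CAMs) for the parts and for the full image, merge the partial CAMs, and regularize the consistency between the merged CAM and the full-image CAM. Below is a minimal PyTorch-style sketch of that idea; the ResNet-18 backbone, the 1x1-convolution CAM head, the vertical half-and-half split, the bilinear resizing, the MSE consistency term, and the loss weight are illustrative assumptions, not the authors' published implementation.

```python
# Minimal sketch of the split-and-merge CAM consistency idea described in the
# abstract. Backbone choice, split scheme, and loss form are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F
import torchvision.models as models


class CAMBackbone(nn.Module):
    """ResNet-18 feature extractor with a 1x1 classifier head so that
    per-class activation maps (CAMs) can be read off before pooling."""

    def __init__(self, num_classes: int = 7):
        super().__init__()
        resnet = models.resnet18(weights=None)
        self.features = nn.Sequential(*list(resnet.children())[:-2])  # B x 512 x h x w
        self.classifier = nn.Conv2d(512, num_classes, kernel_size=1)  # 1x1 conv -> CAMs

    def forward(self, x):
        cams = self.classifier(self.features(x))            # B x C x h x w class activation maps
        logits = F.adaptive_avg_pool2d(cams, 1).flatten(1)  # global average pooling -> logits
        return logits, cams


def split_merge_consistency_loss(model: CAMBackbone, images: torch.Tensor) -> torch.Tensor:
    """Split each image into left/right halves, compute CAMs for both halves,
    stitch them back together, and penalize disagreement with the CAM of the
    full image (resized to the merged CAM's spatial size)."""
    _, cam_full = model(images)

    w = images.shape[-1]
    left, right = images[..., : w // 2], images[..., w // 2 :]
    _, cam_left = model(left)
    _, cam_right = model(right)
    cam_merged = torch.cat([cam_left, cam_right], dim=-1)  # merge half-image CAMs side by side

    # Resize the full-image CAM to match the merged CAM before comparing.
    cam_full = F.interpolate(cam_full, size=cam_merged.shape[-2:],
                             mode="bilinear", align_corners=False)
    return F.mse_loss(cam_full, cam_merged)


if __name__ == "__main__":
    model = CAMBackbone(num_classes=7)           # 7 basic expression classes, as in RAF-DB
    batch = torch.randn(4, 3, 224, 224)          # dummy face crops
    logits, _ = model(batch)
    cls_loss = F.cross_entropy(logits, torch.randint(0, 7, (4,)))
    loss = cls_loss + 0.1 * split_merge_consistency_loss(model, batch)  # weight is illustrative
    loss.backward()
    print(float(loss))
```

In this sketch the consistency term discourages the classifier from relying on a single facial region: if the model attends to one part only, the CAMs of the two halves will not recombine into the full-image CAM, and the regularizer penalizes that mismatch.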