Robust Human Face Emotion Classification Using Triplet-Loss-Based Deep CNN Features and SVM

Authors: Irfan Haider, Hyung-Jeong Yang, Guee-Sang Lee, Soo-Hyung Kim
Language: English
Year of publication: 2023
Source: Sensors, Vol 23, Iss 10, p 4770 (2023)
Document type: article
ISSN: 1424-8220
DOI: 10.3390/s23104770
Description: Human facial emotion detection is a challenging task in computer vision. Owing to high intra-class variance, it is hard for machine learning models to predict facial emotions accurately, and the fact that a single person can display several facial emotions further increases the diversity and complexity of the classification problem. In this paper, we propose a novel and intelligent approach to the classification of human facial emotions. The approach comprises a customized ResNet18, adapted via transfer learning and trained with a triplet loss function (TLF), followed by an SVM classification model. The proposed pipeline consists of a face detector, which locates and refines the face bounding box, and a classifier that identifies the facial expression class of the detected faces. RetinaFace extracts the detected face regions from the source image; the ResNet18 model is trained on the cropped face images with triplet loss to produce deep features; and an SVM classifier then categorizes the facial expression from these features. The proposed method outperforms state-of-the-art (SoTA) methods on the JAFFE and MMI datasets, achieving accuracies of 98.44% and 99.02%, respectively, on seven emotions; however, its performance on the FER2013 and AFFECTNET datasets still needs fine-tuning.
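To make the core idea concrete, below is a minimal NumPy sketch of the triplet loss that the abstract describes, with small hand-made vectors standing in for the deep ResNet18 face embeddings. The function name, margin value, and example vectors are illustrative assumptions, not the paper's implementation; in the actual pipeline, the embeddings produced by the triplet-trained network would then be fed to an SVM classifier.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Standard triplet loss: pull the anchor toward the positive
    (same emotion class) and push it away from the negative
    (different emotion class) by at least `margin`.
    The 0.2 margin is an illustrative choice, not from the paper."""
    d_ap = np.sum((anchor - positive) ** 2)  # squared distance anchor-positive
    d_an = np.sum((anchor - negative) ** 2)  # squared distance anchor-negative
    return max(0.0, d_ap - d_an + margin)

# Toy 2-D "embeddings" standing in for deep face features.
a = np.array([1.0, 0.0])   # anchor: e.g. a "happy" face embedding
p = np.array([1.1, 0.0])   # positive: another "happy" embedding, nearby
n_far = np.array([0.0, 1.0])   # negative already far away
n_near = np.array([1.0, 0.1])  # negative too close to the anchor

print(triplet_loss(a, p, n_far))   # negative well separated -> loss 0.0
print(triplet_loss(a, p, n_near))  # violation -> positive loss near 0.2
```

The loss is zero once every negative is at least `margin` farther from the anchor than the positive, which is what makes the learned features cluster by emotion class and therefore easy for a downstream SVM to separate.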
Database: Directory of Open Access Journals