Temporal-Contrastive Appearance Network for Facial Expression Recognition
Author: | Zi-Jun Li, Li-Chen Fu, Yu-Hung Liu, Yu-Huan Yang, Tso-Hsin Yeh, An-Sheng Liu |
---|---|
Year of publication: | 2018 |
Subject: |
Facial expression, Sequence, Computer science, Pattern recognition, Convolutional neural network, Feature (machine learning), Artificial intelligence, Affective computing, Representation (mathematics) |
Source: | SMC |
DOI: | 10.1109/smc.2018.00405 |
Description: | Facial expression recognition (FER) is a challenging task even for humans, since individuals express their feelings in their own ways and with different intensities. To extract the commonality of facial expressions across different individuals, the personality effect of each individual needs to be minimized as much as possible. In this paper, we present a temporal-contrastive appearance network (TCAN) that utilizes temporal features to remove the personality effect. A high-level feature is extracted from a video, consisting of a sequence of frames, by a proposed convolutional neural network (CNN). To enable our CNN framework to extract similar features from adjacent frames, a special loss function is introduced. Moreover, the neutral and peak expression frames are identified by comparing distances among frames. Facial expressions can then be classified by the so-called contrastive representation between the neutral and peak expressions. We conducted our experiments on the most widely used databases for facial expression recognition (CK+ and Oulu-CASIA). The experimental results show that the proposed method outperforms the state-of-the-art methods. |
Database: | OpenAIRE |
External link: |
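The neutral/peak selection and contrastive step summarized in the description can be sketched as follows. This is a minimal illustration, not the paper's method: the per-frame features stand in for TCAN's CNN output, and the selection heuristic used here (peak = frame farthest on average from all others, neutral = frame farthest from the peak) is an assumption, since the abstract only states that the frames are identified "by comparing distances among frames".

```python
import numpy as np

def contrastive_representation(features):
    """Pick neutral/peak frames from per-frame features and subtract them.

    features: (T, D) array of per-frame appearance features. In TCAN these
    would come from the CNN; here they are simply an input array.
    """
    # Pairwise Euclidean distances between all T frames.
    dists = np.linalg.norm(features[:, None, :] - features[None, :, :], axis=-1)
    # Assumed heuristic: the peak expression frame is the one farthest,
    # on average, from all other frames in the sequence.
    peak = int(np.argmax(dists.mean(axis=1)))
    # The neutral frame is taken as the frame farthest from the peak.
    neutral = int(np.argmax(dists[peak]))
    # Contrastive representation: peak minus neutral, which cancels
    # identity-specific appearance shared by both frames.
    return neutral, peak, features[peak] - features[neutral]
```

Subtracting the neutral frame's feature from the peak frame's feature removes the component common to both (the subject's identity), leaving a representation dominated by the expression change, which is what makes the contrastive representation suitable for classification.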