Description: |
Encoding facial expressions via action units (AUs) has been found effective in resolving the ambiguity among different expressions, so AU detection plays an important role in emotion analysis. While a number of AU detection methods have been proposed for common facial expressions, studies of micro-expression AU detection remain very limited. Micro-expression AU detection is challenging because micro-expressions have subtle appearance and, being spontaneous, are difficult to collect, which leaves only small-scale datasets. In this paper, we focus on micro-expression AU detection and aim to contribute to the community. To address the above issues, a novel dual-view attentive similarity-preserving distillation method is proposed for robust micro-expression AU detection by leveraging massive facial expressions in the wild. Through this attentive similarity-preserving distillation, we alleviate the domain shift problem and efficiently distill essential AU knowledge from common facial expressions. Furthermore, since the generalization ability of the teacher network is important for knowledge distillation, a semi-supervised co-training approach is developed to construct a generalized teacher network that learns discriminative AU representations. Extensive experiments demonstrate that the proposed knowledge distillation method can effectively distill and transfer cross-domain knowledge for robust micro-expression AU detection.
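
For orientation only, below is a minimal PyTorch sketch of a plain similarity-preserving distillation loss of the kind the abstract builds on; it does not reproduce the paper's dual-view attentive variant, and the function name, tensor shapes, and weighting are illustrative assumptions rather than the authors' implementation.

import torch
import torch.nn.functional as F

def similarity_preserving_loss(teacher_feats: torch.Tensor,
                               student_feats: torch.Tensor) -> torch.Tensor:
    """Generic similarity-preserving KD loss (hypothetical helper, not the paper's code).

    Both inputs are batches of features with shape (B, D). Instead of matching
    features directly, the student is trained so that the pairwise similarity
    structure within its batch mimics that of the teacher, which tolerates
    domain shift between the two feature spaces better than pointwise matching.
    """
    b = teacher_feats.size(0)
    # Batch similarity (Gram) matrices, row-normalized.
    g_t = F.normalize(teacher_feats @ teacher_feats.t(), p=2, dim=1)
    g_s = F.normalize(student_feats @ student_feats.t(), p=2, dim=1)
    # Mean squared Frobenius distance between the normalized similarity matrices.
    return ((g_t - g_s) ** 2).sum() / (b * b)

# Example usage with random features standing in for teacher/student embeddings.
t = torch.randn(16, 256)   # teacher AU features (assumed shape)
s = torch.randn(16, 128)   # student AU features; dimensions need not match
loss = similarity_preserving_loss(t, s)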