Bounded Residual Gradient Networks (BReG-Net) for Facial Affect Computing
Author: | Pooran Negi, Mohammad H. Mahoor, Behzad Hasani |
Language: | English |
Year of publication: | 2019 |
Subject: | FOS: Computer and information sciences; Computer Science - Computer Vision and Pattern Recognition (cs.CV); Artificial neural network; Residual; Bounded function; Differentiable function; Connection (vector bundle); Function (mathematics); Net (mathematics); Categorical variable; Artificial intelligence; Algorithm |
Source: | FG |
Description: | Residual-based neural networks have shown remarkable results in various visual recognition tasks, including Facial Expression Recognition (FER). Despite the tremendous efforts that have been made to improve the performance of FER systems using Deep Neural Networks (DNNs), existing methods are not generalizable enough for practical applications. This paper introduces Bounded Residual Gradient Networks (BReG-Net) for facial expression recognition, in which the shortcut connection between the input and the output of the ResNet module is replaced with a differentiable function with a bounded gradient. This configuration prevents the network from facing the vanishing or exploding gradient problem. We show that utilizing such non-linear units results in shallower networks with better performance. Further, by using a weighted loss function that gives higher priority to less represented categories, we achieve an overall better recognition rate. The results of our experiments show that BReG-Nets outperform state-of-the-art methods on three publicly available facial databases in the wild, on both the categorical and dimensional models of affect. To appear in the 14th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2019). |
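The two ideas in the abstract can be illustrated with a minimal, hypothetical sketch: a shortcut function whose derivative is bounded (here `tanh`, chosen purely for illustration; the paper's actual bounded-gradient function may differ), and a per-class-weighted cross-entropy that prioritizes under-represented expression categories. The function names below are this sketch's own, not the paper's.

```python
import math

def bounded_shortcut(x):
    """Hypothetical bounded-gradient shortcut replacing the identity
    connection of a ResNet block (illustrative only)."""
    return math.tanh(x)

def bounded_shortcut_grad(x):
    """d/dx tanh(x) = 1 - tanh(x)**2, which lies in (0, 1] for all x."""
    return 1.0 - math.tanh(x) ** 2

def weighted_cross_entropy(probs, label, class_weights):
    """Cross-entropy scaled by a per-class weight so that less
    represented categories contribute more to the loss."""
    return -class_weights[label] * math.log(probs[label])

# The shortcut's gradient never exceeds 1, so backpropagating through
# many stacked modules cannot explode along the shortcut path; it is
# also strictly positive, which guards against vanishing at the origin.
grad_bound = max(bounded_shortcut_grad(x / 10.0) for x in range(-100, 101))
print(grad_bound)  # largest at x = 0, where the derivative equals 1
```

Because the gradient of the shortcut stays in (0, 1], the chain of derivatives through stacked modules remains controlled, which is the abstract's stated motivation for replacing the plain identity connection.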
Database: | OpenAIRE |
External link: |