Mixed-pooling-dropout for convolutional neural network regularization.

Author: Ait Skourt, Brahim; El Hassani, Abdelhamid; Majda, Aicha
Source: Journal of King Saud University - Computer & Information Sciences; Sep2022: Part A, Vol. 34, Issue 8, p4756-4762, 7p
Abstract: Deep neural networks are the most widely used machine learning systems in the literature, as they can train on huge amounts of data with a large number of parameters very effectively. However, one of the problems such networks face is overfitting. There are many ways to address the overfitting issue, one of which is regularization using the dropout function. Dropout has the benefit of combining different networks in one architecture and preventing units from co-adapting excessively. The dropout function is known to work well in fully-connected layers as well as in pooling layers. In this work, we propose a novel method called Mixed-Pooling-Dropout that adapts the dropout function to a mixed-pooling strategy. The dropout operation is represented by a binary mask whose elements are drawn independently from a Bernoulli distribution. Experimental results show that our proposed method outperforms conventional pooling methods as well as the max-pooling-dropout method by a notable margin (0.926 vs 0.868) regardless of the retaining probability. [ABSTRACT FROM AUTHOR]
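The abstract describes a dropout mask with elements drawn independently from a Bernoulli distribution, combined with a mixed-pooling strategy. The sketch below is a minimal illustration of that general idea, not the paper's exact algorithm: the function name, the retain/mix probabilities, and the rule for mixing max and average pooling are all assumptions introduced here for illustration.

```python
import numpy as np

def mixed_pooling_dropout(region, p_retain=0.8, p_max=0.5, rng=None):
    """Illustrative sketch (not the authors' implementation):
    apply a Bernoulli dropout mask to one pooling region, then
    pool with a stochastic mix of max and average pooling."""
    rng = np.random.default_rng() if rng is None else rng
    # Dropout: each activation is kept independently with
    # probability p_retain (Bernoulli mask, as in the abstract).
    mask = rng.binomial(1, p_retain, size=region.shape)
    masked = region * mask
    # Mixed pooling: pick max-pooling with probability p_max,
    # otherwise average-pooling (one possible mixing rule).
    if rng.random() < p_max:
        return masked.max()
    return masked.mean()

# Example: a single 2x2 pooling region.
region = np.array([[1.0, 2.0],
                   [3.0, 4.0]])
out = mixed_pooling_dropout(region, p_retain=0.8)
```

With `p_retain=1.0` the mask keeps every activation, so the output is either the region's max (4.0) or its mean (2.5), depending on which pooling mode is sampled.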
Database: Supplemental Index