Does Removing Pooling Layers from Convolutional Neural Networks Improve Results?
Author: | João Paulo Papa, Danilo Colombo, Thierry Pinheiro Moreira, Claudio Filipi Goncalves dos Santos |
---|---|
Contributors: | Universidade Federal de São Carlos (UFSCar), Universidade Estadual Paulista (UNESP), Petrobras |
Year of publication: | 2020 |
Subject: | |
Source: | Scopus; Repositório Institucional da UNESP; Universidade Estadual Paulista (UNESP); instacron:UNESP |
ISSN: | 2661-8907, 2662-995X |
DOI: | 10.1007/s42979-020-00295-9 |
Description: | Due to their large number of parameters, convolutional neural networks are known for long training periods and extended inference times. Learning may demand so much computational power that it requires a costly machine and, sometimes, weeks of training. In this context, there is a trend already in motion to replace pooling layers with a stride operation in the preceding convolutional layer to save time. In this work, we evaluate the speedup of such an approach and how it trades off against accuracy loss across multiple computer vision domains, deep neural architectures, and datasets. The results show significant acceleration with an almost negligible loss in accuracy, when any, which is a further indication that convolutional pooling in deep learning performs redundant calculations. Affiliations: Federal University of São Carlos (UFSCar); São Paulo State University (UNESP); Cenpes, Petróleo Brasileiro S.A. (Petrobras), RJ. Funding: Petrobras #2017/00285-6; FAPESP #2017/25908-6, #2018/15597-6, #2019/07665-4, #2013/07375-0, #2014/12236-1; CNPq #307066/2017-7, #427968/2018-6. |
Database: | OpenAIRE |
External link: |
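The abstract describes replacing a pooling layer with a stride in the preceding convolution. A minimal 1-D sketch of the idea (hypothetical illustration, not the paper's code): the conventional path computes a stride-1 convolution at every position and then a pooling step discards half the results, while a stride-2 convolution downsamples in one pass, skipping the multiply-adds whose outputs pooling would have thrown away.

```python
# Hypothetical 1-D sketch of "pooling vs. strided convolution" downsampling.
# conv1d and max_pool1d are illustrative helpers, not from any library.

def conv1d(x, kernel, stride=1):
    """Valid cross-correlation of a 1-D signal with the given stride."""
    k = len(kernel)
    return [sum(x[i + j] * kernel[j] for j in range(k))
            for i in range(0, len(x) - k + 1, stride)]

def max_pool1d(x, size=2):
    """Non-overlapping max pooling with window `size`."""
    return [max(x[i:i + size]) for i in range(0, len(x) - size + 1, size)]

signal = [float(v) for v in range(16)]
kernel = [0.25, 0.5, 0.25]

# Conventional path: stride-1 convolution, then pooling halves the resolution.
pooled = max_pool1d(conv1d(signal, kernel, stride=1), size=2)

# Alternative path: a stride-2 convolution downsamples directly, computing
# only half as many convolution outputs as the pooled path above.
strided = conv1d(signal, kernel, stride=2)

print(len(pooled), len(strided))  # both halve the spatial dimension
```

Both paths produce an output of the same reduced length; the strided variant simply never computes the positions pooling would discard, which is the source of the speedup the paper measures.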