Hybrid neural networks for big data classification

Author: Gerardo Hernández, Germán Téllez, Humberto Sossa, Federico Furlán, Erik Zamora
Year of publication: 2020
Source: Neurocomputing. 390:327-340
ISSN: 0925-2312
DOI: 10.1016/j.neucom.2019.08.095
Description: This paper introduces two new hybrid neural architectures that combine morphological neurons and perceptrons. The first, the Morphological-Linear Neural Network (MLNN), consists of a hidden layer of morphological neurons, which acts as a feature extractor, followed by an output layer of classical perceptrons. The second, the Linear-Morphological Neural Network (LMNN), uses one or more perceptron layers as a feature extractor, followed by an output layer of morphological neurons for non-linear classification. Both architectures are trained by stochastic gradient descent. One of the main contributions of this paper is to show, both theoretically and experimentally, that the morphological layer offers a greater capacity to extract features than the perceptron layer. We prove that the morphological layer possesses a greater capacity per computation unit to segment the 2D input space than the perceptron layer; in other words, adding hyper-boxes produces more response regions than adding hyperplanes. Empirically, we test the two new models on 25 standard low-dimensional datasets and one big data dataset. MLNN requires fewer learning parameters than the other tested architectures while achieving better accuracies.
Database: OpenAIRE
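The abstract describes the MLNN as a morphological hidden layer (hyper-box responses) followed by a linear perceptron read-out. A minimal NumPy sketch of such a forward pass is given below; the specific hyper-box activation (signed depth of the input inside a box, computed with min operations) and the function names `hyperbox_layer` / `mlnn_forward` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def hyperbox_layer(x, lo, hi):
    """Morphological hidden layer (illustrative assumption).

    Unit k responds with the signed "depth" of input x inside the
    hyper-box [lo[k], hi[k]]: positive inside the box, negative outside.
    lo, hi have shape (n_units, n_features); x has shape (n_features,).
    """
    # Distance to each face of each box, reduced with a min over features
    # (the morphological, non-linear operation).
    return np.min(np.minimum(x[None, :] - lo, hi - x[None, :]), axis=1)

def mlnn_forward(x, lo, hi, W, b):
    """MLNN-style forward pass: morphological features, linear read-out.

    W has shape (n_classes, n_units); the class scores would normally be
    passed through softmax/argmax for classification.
    """
    h = hyperbox_layer(x, lo, hi)
    return W @ h + b
```

For example, with a single unit whose box is `[0, 1] x [0, 1]`, the point `(0.5, 0.5)` yields a positive response (it lies inside the box) while `(2, 0.5)` yields a negative one. In a trained network, `lo`, `hi`, `W`, and `b` would all be learned by stochastic gradient descent, as the abstract states.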