Training neural network classifiers through Bayes risk minimization applying unidimensional Parzen windows

Authors: Marcelino Lázaro, Aníbal R. Figueiras-Vidal, Monson H. Hayes
Contributors: Ministerio de Economía y Competitividad (España)
Language: English
Publication year: 2018
Source: e-Archivo. Repositorio Institucional de la Universidad Carlos III de Madrid
Description: A new training algorithm for neural networks in binary classification problems is presented. It is based on the minimization of an estimate of the Bayes risk, obtained by applying Parzen windows to the final one-dimensional nonlinear transformation of the samples in order to estimate the probability of classification error. This leads to a very general approach to error minimization and training, in which the risk to be minimized is defined in terms of integrated one-dimensional Parzen windows, and the gradient descent algorithm used to minimize this risk is a function of the chosen window. By relaxing the constraints typically imposed on Parzen windows when they are used for probability density function estimation, for example by allowing them to be non-symmetric or possibly infinite in duration, an entirely new set of training algorithms emerges. In particular, different Parzen windows lead to different cost functions, and some interesting relationships with classical training methods are discovered. Experiments with synthetic and real benchmark datasets show that, with an appropriate choice of window fitted to the specific problem, it is possible to improve the performance of neural network classifiers over those trained with classical methods. (C) 2017 Elsevier Ltd. All rights reserved.

This work was partly supported by Grant TEC-2015-67719-P “Macro-ADOBE” (Spain MINECO/EU FSE, FEDER) and network TIN 2015-70808-REDT “DAMA” (MINECO) (M. Lázaro and A.R. Figueiras-Vidal), and by Prof. Monson Hayes’ Banco de Santander-UC3M Chair of Excellence, 2015.
Database: OpenAIRE
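The core idea described in the abstract, replacing the non-differentiable classification-error count with a smooth estimate built from an integrated one-dimensional Parzen window applied to the classifier's scalar output, can be illustrated with a minimal sketch. This is not the authors' implementation: the logistic window choice, the linear model standing in for the network's final transformation, the bandwidth `h`, and all function names are illustrative assumptions.

```python
import numpy as np

def sigmoid(u):
    # Integrated logistic Parzen window: a smooth surrogate for the
    # step function 1{u > 0}, approaching it as the bandwidth shrinks.
    return 1.0 / (1.0 + np.exp(-u))

def error_estimate(w, X, y, h):
    """Parzen-window estimate of the probability of classification error.

    z = X @ w plays the role of the network's final one-dimensional
    output; a sample with label y in {-1, +1} is misclassified when
    y * z < 0.  Replacing the hard indicator 1{y*z < 0} with the
    integrated window sigmoid(-y*z / h) gives a differentiable
    estimate that tends to the empirical error rate as h -> 0.
    """
    z = X @ w
    return np.mean(sigmoid(-y * z / h))

def train(X, y, h=0.5, lr=1.0, steps=200):
    """Gradient descent on the smoothed Bayes-risk estimate."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        u = -y * (X @ w) / h
        s = sigmoid(u)
        # Chain rule: d/dw mean(sigmoid(u)) with u = -y*(X@w)/h,
        # using sigmoid'(u) = sigmoid(u) * (1 - sigmoid(u)).
        grad = (s * (1.0 - s) * (-y / h)) @ X / len(y)
        w -= lr * grad
    return w

# Toy 1-D two-class data; a constant feature folds in the bias term.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (50, 1)), rng.normal(2, 1, (50, 1))])
X = np.hstack([X, np.ones((100, 1))])
y = np.concatenate([-np.ones(50), np.ones(50)])

w = train(X, y)
err = error_estimate(w, X, y, h=0.5)
```

Swapping `sigmoid` for the CDF of a different kernel (Gaussian, asymmetric, or infinite-support) changes the effective cost function, which is the mechanism by which, per the abstract, different windows recover or depart from classical training criteria.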