Strategies for improving neural net generalisation

Authors: Derek Partridge, Niall Griffith
Year of publication: 1995
Source: Neural Computing & Applications. 3:27-37
ISSN: 1433-3058, 0941-0643
DOI: 10.1007/bf01414174
Description: We address the problem of training multilayer perceptrons to instantiate a target function. In particular, we explore the accuracy of the trained network on a test set of previously unseen patterns, i.e. its generalisation ability. We systematically evaluate alternative strategies designed to improve generalisation performance. The basic idea is to generate a diverse set of networks, each of which is designed to be an implementation of the target function. We then have a set of trained, alternative versions: a version set. The goal is to achieve 'useful diversity' within this set, and thus generate the potential for improved generalisation performance of the set as a whole when compared to the performance of any individual version. We define this notion of 'useful diversity', define a metric for it, explore a number of ways of generating it, and present the results of an empirical study of a number of strategies for exploiting it to achieve maximum generalisation performance. The strategies encompass statistical measures as well as a 'selector net' approach which proves particularly promising. The selector net is a form of 'meta-net' that operates in conjunction with a version set.
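The core intuition behind a version set, as the description sketches it, is that a collection of independently trained versions can outperform any single member when their errors are not strongly correlated. A minimal illustration of this effect (not the paper's own experiments; the function names and the independent-error model are assumptions for the sketch) is a Monte Carlo estimate of majority-vote accuracy over a version set whose members fail independently:

```python
import random

def majority_vote(predictions):
    # Combine a version set's binary outputs by simple majority.
    return 1 if sum(predictions) > len(predictions) / 2 else 0

def version_set_accuracy(n_versions, per_version_acc, n_trials=10000, seed=0):
    # Monte Carlo estimate: each 'version' errs independently with
    # probability (1 - per_version_acc). Fully independent errors are an
    # idealised stand-in for 'useful diversity' in the version set.
    rng = random.Random(seed)
    correct = 0
    for _ in range(n_trials):
        # Target label is 1; each version predicts it correctly with
        # probability per_version_acc.
        preds = [1 if rng.random() < per_version_acc else 0
                 for _ in range(n_versions)]
        correct += majority_vote(preds)
    return correct / n_trials
```

With, say, nine versions each 80% accurate and errors independent, the voted set's estimated accuracy rises well above 80%; with perfectly correlated errors (no diversity) voting gains nothing, which is why the paper's strategies aim to generate and measure diversity rather than simply replicate one network.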
Database: OpenAIRE