Efficient IntVec: High recognition rate with reduced computational cost
Author: | Kunihiko Fukushima |
---|---|
Year of publication: | 2019 |
Subject: | Support vector machine; Computer science; Cognitive neuroscience; Process (computing); Pattern recognition; Neocognitron; Class (biology); Pattern recognition, automated; Pattern recognition (psychology); Artificial intelligence; Neural networks (computer); Layer (object-oriented design); 02 engineering and technology; 0209 industrial biotechnology; 020901 industrial engineering & automation; 0202 electrical engineering, electronic engineering, information engineering; 020201 artificial intelligence & image processing |
Source: | Neural Networks: the official journal of the International Neural Network Society, vol. 119 |
ISSN: | 1879-2782 |
Description: | In many deep neural networks for pattern recognition, the input pattern is classified in the deepest layer on the basis of features extracted through the intermediate layers. IntVec (interpolating-vector) is known to be a powerful method for this classification step. Although the recognition error obtained with IntVec can be made much smaller than with WTA (winner-take-all) or even with SVM (support vector machines), IntVec requires a large computational cost. This paper proposes a new method by which the computational cost of IntVec can be reduced drastically without increasing the recognition error. Although we basically use IntVec for recognition, we replace it with WTA, which requires a much smaller computational cost, under certain conditions. More specifically, we first try to classify the input vector using WTA. If a class is a complete loser by WTA, we judge it to be a loser by IntVec as well and omit the IntVec calculation for that class. If a class is an unrivaled winner by WTA, the IntVec calculation can be omitted for all classes. (An illustrative sketch of this screening procedure appears after this record.) |
Database: | OpenAIRE |
External link: |
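
The screening scheme summarized in the description can be sketched in a few lines. The following is a minimal illustrative sketch, not the paper's implementation: it assumes cosine similarity on unit-normalized feature vectors, approximates a class's IntVec similarity by a coarse grid search over vectors interpolating between pairs of that class's reference vectors, and uses hypothetical margin parameters `loser_margin` and `winner_margin` to decide when a class counts as a "complete loser" or an "unrivaled winner" under WTA.

```python
import numpy as np

def wta_similarity(x, refs):
    """Best cosine similarity between unit vector x and any single
    unit-norm reference vector of one class (winner-take-all score)."""
    return float(np.max(refs @ x))

def intvec_similarity(x, refs, steps=10):
    """Rough IntVec-style score: best cosine similarity between x and
    vectors interpolating between pairs of reference vectors of one class.
    The coarse grid search over the interpolation parameter t is used
    purely for illustration; it makes the expensive O(n^2) pair loop obvious."""
    best = wta_similarity(x, refs)
    n = len(refs)
    for i in range(n):
        for j in range(i + 1, n):
            for t in np.linspace(0.0, 1.0, steps + 1):
                v = (1.0 - t) * refs[i] + t * refs[j]
                norm = np.linalg.norm(v)
                if norm > 0.0:
                    best = max(best, float((v / norm) @ x))
    return best

def classify(x, class_refs, loser_margin=0.1, winner_margin=0.1):
    """Classify x with IntVec, screening the classes with WTA first.

    class_refs maps a class label to an array of unit-norm reference
    (feature) vectors; at least two classes are assumed.  Both margins
    are hypothetical tuning knobs, not values taken from the paper."""
    x = np.asarray(x, dtype=float)
    x = x / np.linalg.norm(x)

    # Step 1: cheap WTA score for every class.
    wta = {c: wta_similarity(x, refs) for c, refs in class_refs.items()}
    ranked = sorted(wta, key=wta.get, reverse=True)
    top, runner_up = ranked[0], ranked[1]

    # Unrivaled WTA winner: skip the IntVec calculation entirely.
    if wta[top] - wta[runner_up] >= winner_margin:
        return top

    # Otherwise run IntVec, but only for classes that are not
    # complete losers under WTA.
    candidates = [c for c in ranked if wta[top] - wta[c] < loser_margin]
    scores = {c: intvec_similarity(x, class_refs[c]) for c in candidates}
    return max(scores, key=scores.get)

# Hypothetical usage: 10 classes, 20 reference vectors each, 64-dim features.
rng = np.random.default_rng(0)
class_refs = {c: rng.normal(size=(20, 64)) for c in range(10)}
class_refs = {c: r / np.linalg.norm(r, axis=1, keepdims=True)
              for c, r in class_refs.items()}
print(classify(rng.normal(size=64), class_refs))
```

In this sketch the pairwise IntVec loop runs only for the handful of classes whose WTA score is close to the current leader, which is where the cost reduction described in the abstract comes from; when the leader is far ahead of every rival, the WTA answer is returned directly.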