Universal Approximation Theorem for Neural Networks

Author: Nishijima, Takato
Year of publication: 2021
Subject:
Document type: Working Paper
Description: Is there any theoretical guarantee for the approximation ability of neural networks? The answer to this question is the "Universal Approximation Theorem for Neural Networks". This theorem states that the set of functions realizable by a neural network is dense in a suitable function space under appropriate conditions. This paper provides a comprehensive explanation, in Japanese, of the universal approximation theorem for feedforward neural networks, its approximation rate problem (the relation between the number of hidden units and the approximation error), and Barron space.
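For orientation (these formulas are not part of the record itself), the classical statement of the theorem and the approximation rate the abstract alludes to are commonly given as follows. Universal approximation (Cybenko 1989, Hornik 1991): for a sigmoidal activation $\sigma$, the one-hidden-layer networks
    $f_n(x) = \sum_{k=1}^{n} c_k \, \sigma(w_k \cdot x + b_k)$
are dense in $C(K)$ for every compact $K \subset \mathbb{R}^d$; that is, for any continuous $f$ and any $\varepsilon > 0$ there exist $n$ and parameters $(c_k, w_k, b_k)$ with $\sup_{x \in K} |f(x) - f_n(x)| < \varepsilon$. Approximation rate (Barron 1993): if $C_f = \int |\omega| \, |\hat{f}(\omega)| \, d\omega < \infty$ (i.e. $f$ lies in Barron space), then for every $n$ there is a network $f_n$ with $n$ sigmoidal units satisfying
    $\| f - f_n \|_{L^2(\mu)}^2 \le (2 C_f)^2 / n$
on the unit ball, so the $L^2$ error decays like $O(n^{-1/2})$ independently of the input dimension $d$.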
Comment: 118 pages, in Japanese
Database: arXiv