Two-layer networks with the ReLU^k activation function: Barron spaces and derivative approximation.

Author: Li, Yuanyuan; Lu, Shuai; Mathé, Peter; Pereverzev, Sergei V.
Source: Numerische Mathematik, Feb 2024, Vol. 156, Issue 1, pp. 319-344.
Abstract: We investigate the use of two-layer networks with the rectified power unit, also known as the ReLU^k activation function, for function and derivative approximation. By extending and calibrating the corresponding Barron space, we show that two-layer networks with the ReLU^k activation function are well suited to simultaneously approximate an unknown function and its derivatives. When the measurements are noisy, we propose a Tikhonov-type regularization method and provide error bounds when the regularization parameter is chosen appropriately. Several numerical examples support the efficiency of the proposed approach.
Database: Complementary Index
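
As a rough illustration of the setting described in the abstract, the sketch below fits a two-layer ReLU^k network to noisy samples of a smooth function and recovers both the function and its derivative. This is not the authors' code: the network width, the value of k, the random inner weights, and the regularization parameter lam are all assumptions made for the demo, and only the outer coefficients are fitted, via Tikhonov-regularized least squares.

```python
import numpy as np

# Hypothetical sketch (not the paper's implementation): a two-layer network
#   f(x) = sum_j a_j * relu(w_j * x + b_j)^k
# with fixed random inner weights; only the outer coefficients a_j are
# learned from noisy data via Tikhonov-regularized least squares.

rng = np.random.default_rng(0)
k = 3          # power in the ReLU^k activation (assumed for the demo)
width = 200    # number of hidden units (assumed)
lam = 1e-3     # Tikhonov regularization parameter (assumed)

# Random inner weights and biases of the hidden layer
w = rng.normal(size=width)
b = rng.uniform(-1.0, 1.0, size=width)

def features(x):
    """Hidden-layer outputs relu(w*x + b)^k for each sample in x."""
    z = np.maximum(np.outer(x, w) + b, 0.0)
    return z ** k

def feature_derivatives(x):
    """d/dx of each hidden unit: k * relu(w*x + b)^(k-1) * w."""
    z = np.maximum(np.outer(x, w) + b, 0.0)
    return k * z ** (k - 1) * w

# Noisy measurements of a smooth target on [-1, 1]
x_train = np.linspace(-1.0, 1.0, 100)
y_noisy = np.sin(np.pi * x_train) + 0.05 * rng.normal(size=x_train.size)

# Tikhonov-regularized least squares: (Phi^T Phi + lam I) a = Phi^T y
Phi = features(x_train)
a = np.linalg.solve(Phi.T @ Phi + lam * np.eye(width), Phi.T @ y_noisy)

# Evaluate the errors of the function and derivative approximations
x_test = np.linspace(-0.9, 0.9, 50)
f_err = np.max(np.abs(features(x_test) @ a - np.sin(np.pi * x_test)))
df_err = np.max(np.abs(feature_derivatives(x_test) @ a
                       - np.pi * np.cos(np.pi * x_test)))
print("function   max error:", f_err)
print("derivative max error:", df_err)
```

Note that the same fitted coefficients are reused for the derivative: since the activation is ReLU^k with k >= 2, each hidden unit is differentiable, so the network's derivative is available in closed form, which is what makes simultaneous function and derivative approximation possible in this setting.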