Separable Gaussian Neural Networks: Structure, Analysis, and Function Approximations

Author: Siyuan Xing, Jian-Qiao Sun
Language: English
Year of publication: 2023
Source: Algorithms, Vol 16, Iss 10, p 453 (2023)
Document type: article
ISSN: 1999-4893
DOI: 10.3390/a16100453
Description: The Gaussian radial basis function neural network (GRBFNN) has been a popular choice for interpolation and classification. However, it is computationally intensive when the dimension of the input vector is high. To address this issue, we propose a new feedforward network, the separable Gaussian neural network (SGNN), which takes advantage of the separable property of Gaussian radial basis functions: it splits the input data into multiple columns and sequentially feeds them into parallel layers formed by univariate Gaussian functions. This structure reduces the number of neurons from O(N^d) for GRBFNN to O(dN), which exponentially improves the computational speed of SGNN and makes it scale linearly as the input dimension increases. In addition, SGNN can preserve the dominant subspace of the Hessian matrix of GRBFNN in gradient descent training, leading to a similar level of accuracy to GRBFNN. It is experimentally demonstrated that SGNN can achieve an acceleration of 100 times over GRBFNN, with a similar level of accuracy, on trivariate function approximations. SGNN also has better trainability and is more tuning-friendly than DNNs with ReLU and Sigmoid functions. For approximating functions with a complex geometry, SGNN can lead to results that are three orders of magnitude more accurate than those of a ReLU-DNN with twice the number of layers and twice the number of neurons per layer.
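The separable property the abstract refers to can be illustrated with a short NumPy sketch (an illustrative example, not the authors' implementation): a d-variate Gaussian RBF factors exactly into a product of univariate Gaussians, which is what allows a grid of N^d multivariate neurons to be replaced by d parallel layers of N univariate units each.

```python
import numpy as np

def gaussian_rbf(x, c, sigma=1.0):
    """Multivariate Gaussian radial basis function exp(-||x - c||^2 / (2 sigma^2))."""
    return np.exp(-np.sum((x - c) ** 2) / (2 * sigma ** 2))

def separable_gaussian(x, c, sigma=1.0):
    """The same value computed as a product of univariate Gaussians,
    one factor per input dimension."""
    return np.prod(np.exp(-(x - c) ** 2 / (2 * sigma ** 2)))

# For d = 3 dimensions and N centers per dimension, a full GRBFNN grid
# needs N**3 multivariate neurons, while the separable form needs 3 * N
# univariate units (hypothetical sizes for illustration).
x = np.array([0.3, -1.2, 0.7])
c = np.array([0.0, 0.5, 1.0])
print(np.isclose(gaussian_rbf(x, c), separable_gaussian(x, c)))  # True
```

The factorization is exact for Gaussians (since the squared norm in the exponent is a sum over coordinates), which is why the reduction in neuron count does not by itself sacrifice representational accuracy.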
Database: Directory of Open Access Journals