Showing 1 - 10 of 154 for the search '"Zhang, Haizhang"'
We introduce a novel random integration algorithm that boasts both high convergence order and polynomial tractability for functions characterized by sparse frequencies or rapidly decaying Fourier coefficients. Specifically, for integration in periodic …
External link:
http://arxiv.org/abs/2406.16627
Authors:
Xu, Yuesheng, Zhang, Haizhang
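The snippet above describes a random integration method for periodic integrands. As a rough illustration of the setting only (the paper's algorithm is more sophisticated; this is the plain Monte Carlo baseline it improves upon, with a test integrand of our own choosing):

```python
import math
import random

def mc_integrate(f, dim, n_samples, seed=0):
    """Plain Monte Carlo estimate of the integral of f over [0, 1]^dim.

    This is only the generic baseline; the paper's random algorithm for
    sparse/rapidly decaying Fourier coefficients is not reproduced here.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        x = [rng.random() for _ in range(dim)]
        total += f(x)
    return total / n_samples

# A 1-periodic test integrand with a single nonzero frequency;
# its exact integral over [0, 1]^2 is 0.
f = lambda x: math.cos(2 * math.pi * (x[0] + x[1]))
est = mc_integrate(f, dim=2, n_samples=20000)
```

Plain Monte Carlo converges at rate O(n^{-1/2}) regardless of smoothness; the abstract claims a higher order when the Fourier coefficients are sparse or decay fast.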
We consider deep neural networks with a Lipschitz continuous activation function and with weight matrices of variable widths. We establish a uniform convergence analysis framework in which sufficient conditions on weight matrices and bias vectors together …
External link:
http://arxiv.org/abs/2306.01692
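The abstract above concerns convergence of outputs as network depth grows. A toy numerical illustration of the flavor of such a result, assuming a summability-style condition we picked ourselves (near-identity weights W_k = I + 2^{-k} E with geometrically decaying perturbations; the paper's precise sufficient conditions are not reproduced here):

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def output_at_depth(x0, depth):
    """Evaluate x_{k+1} = relu(W_k x_k + b_k) for k = 0..depth-1, with
    W_k = I + 0.5**k * E and b_k = 0.5**k * b. The geometric decay makes
    the layer perturbations summable -- a toy stand-in for the kind of
    condition on weights and biases a uniform-convergence analysis imposes.
    """
    rng = np.random.default_rng(0)       # fixed E, b across calls
    E = rng.standard_normal((3, 3)) * 0.1
    b = rng.standard_normal(3) * 0.1
    x = x0.copy()
    for k in range(depth):
        W = np.eye(3) + (0.5 ** k) * E
        x = relu(W @ x + (0.5 ** k) * b)
    return x

x0 = np.ones(3)
# Successive differences shrink: the output sequence is (numerically) Cauchy.
d1 = np.linalg.norm(output_at_depth(x0, 10) - output_at_depth(x0, 20))
d2 = np.linalg.norm(output_at_depth(x0, 20) - output_at_depth(x0, 40))
```

Here deepening the network past a point barely changes the output, which is the phenomenon a uniform convergence framework makes precise.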
Multi-view attributed graph clustering is an important approach to partitioning multi-view data based on the attribute features and adjacency matrices from different views. Some attempts have been made at utilizing Graph Neural Networks (GNN), which have …
External link:
http://arxiv.org/abs/2211.14987
Authors:
Huang, Wentao, Zhang, Haizhang
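The entry above combines per-view adjacency matrices with shared node attributes. A deliberately minimal sketch of that combination (our own construction, not the paper's model): symmetrically normalize each view's adjacency as in standard GNNs, average across views, and propagate the attribute matrix once.

```python
import numpy as np

def normalize_adj(A):
    """Symmetric normalization with self-loops, D^{-1/2}(A + I)D^{-1/2}:
    the propagation operator commonly used in GNN layers."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def multi_view_embed(adjs, X):
    """Average the normalized adjacencies of all views, then propagate the
    shared attribute matrix X once. A minimal stand-in for the multi-view
    GNN encoders the abstract alludes to."""
    A_avg = sum(normalize_adj(A) for A in adjs) / len(adjs)
    return A_avg @ X

# Two views over 4 nodes, 2-dimensional attributes.
A1 = np.array([[0, 1, 0, 0], [1, 0, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]], float)
A2 = np.array([[0, 1, 1, 0], [1, 0, 0, 0], [1, 0, 0, 1], [0, 0, 1, 0]], float)
X = np.arange(8, dtype=float).reshape(4, 2)
Z = multi_view_embed([A1, A2], X)   # smoothed node embeddings, shape (4, 2)
```

Clustering (e.g. k-means) would then run on the rows of `Z`; real multi-view methods learn view weights and deeper encoders rather than a plain average.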
Various powerful deep neural network architectures have made great contributions to the exciting successes of deep learning in the past two decades. Among them, deep Residual Networks (ResNets) are of particular importance because they demonstrated …
External link:
http://arxiv.org/abs/2205.06571
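For reference, the defining feature of a ResNet is the identity skip connection: each block computes x + F(x) rather than F(x) alone. A minimal numerical sketch of one such block (the generic textbook form, not the specific setup analyzed in the paper):

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def residual_block(x, W1, W2):
    """A basic residual block: output = x + W2 @ relu(W1 @ x).
    The identity term is what lets very deep ResNets remain trainable."""
    return x + W2 @ relu(W1 @ x)

rng = np.random.default_rng(1)
W1 = rng.standard_normal((4, 4)) * 0.1
W2 = rng.standard_normal((4, 4)) * 0.1
x = rng.standard_normal(4)
y = residual_block(x, W1, W2)
# With small weights the block is a small perturbation of the identity map.
```

Stacking such blocks gives x_{k+1} = x_k + F_k(x_k); when the residuals F_k are suitably small, deepening the stack changes the output less and less, which is the kind of convergence question the abstract raises.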
Deep neural networks, as a powerful system for representing high-dimensional complex functions, play a key role in deep learning. Convergence of deep neural networks is a fundamental issue in building the mathematical foundation for deep learning. We investigate …
External link:
http://arxiv.org/abs/2205.06570
Authors:
Xu, Yuesheng, Zhang, Haizhang
Convergence of deep neural networks as the depth of the networks tends to infinity is fundamental in building the mathematical foundation for deep learning. In a previous study, we investigated this question for deep ReLU networks with a fixed width.
External link:
http://arxiv.org/abs/2109.13542
Authors:
Xu, Yuesheng, Zhang, Haizhang
We explore the convergence of deep neural networks with the popular ReLU activation function as the depth of the networks tends to infinity. To this end, we introduce the notion of activation domains and activation matrices of a ReLU network. By replacing …
External link:
http://arxiv.org/abs/2107.12530
Authors:
Yang, Yunfei, Zhang, Haizhang
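The "activation matrix" device mentioned above exploits the fact that, at any fixed input, a ReLU layer acts as a linear map: relu(Wx + b) = D(Wx + b), where D is the diagonal 0/1 matrix recording which units fire. On the activation domain where that pattern stays constant, the layer is exactly affine. A small check of this identity (notation is our own sketch, not the paper's):

```python
import numpy as np

def activation_matrix(W, b, x):
    """Diagonal 0/1 matrix encoding which ReLU units are active at input x.
    Wherever this pattern is constant, relu(Wx + b) equals D @ (Wx + b)."""
    pre = W @ x + b
    return np.diag((pre > 0).astype(float))

rng = np.random.default_rng(0)
W = rng.standard_normal((3, 3))
b = rng.standard_normal(3)
x = rng.standard_normal(3)

D = activation_matrix(W, b, x)
exact = np.maximum(W @ x + b, 0.0)       # the ReLU layer itself
linearized = D @ (W @ x + b)             # its local affine form
```

Replacing each ReLU layer by its activation matrix turns a deep network, locally, into a product of matrices, which is what makes a depth-to-infinity convergence analysis tractable.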
Reconstructing a band-limited function from its finite sample data is a fundamental task in signal analysis. A Gaussian regularized Shannon sampling series has been proved able to achieve exponential convergence for uniform sampling. Whether such …
External link:
http://arxiv.org/abs/2106.08647
Authors:
Xu, Yuesheng, Zhang, Haizhang
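The Gaussian regularized Shannon series mentioned above multiplies each sinc term of the classical sampling series by a Gaussian window, which localizes the sum and (for uniform, oversampled data) yields exponential accuracy. A sketch with illustrative parameter choices of our own (N, h, delta below are not the paper's tuned values):

```python
import numpy as np

def gauss_shannon(f, t, N, h, delta):
    """Evaluate the Gaussian-regularized Shannon sampling series
        sum_{|n|<=N} f(n h) * sinc((t - n h)/h) * exp(-(t - n h)^2 / (2 delta^2))
    at the point t, from the uniform samples f(n h).
    np.sinc is the normalized sinc, sin(pi x)/(pi x)."""
    tn = np.arange(-N, N + 1) * h
    ker = np.sinc((t - tn) / h) * np.exp(-((t - tn) ** 2) / (2.0 * delta ** 2))
    return float(np.sum(f(tn) * ker))

# Band-limited test signal: np.sinc(x/2) has Fourier support [-pi/2, pi/2],
# comfortably inside the [-pi, pi] band that sampling step h = 1 resolves.
f = lambda x: np.sinc(x / 2.0)
approx = gauss_shannon(f, 0.3, N=25, h=1.0, delta=4.0)
err = abs(approx - f(0.3))
```

The Gaussian factor makes the truncated sum converge far faster than the bare sinc series, whose tails decay only like 1/n; the paper asks whether the same exponential rate survives non-uniform sampling.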
Published in:
Neurocomputing, Volume 571, 28 February 2024
Authors:
Gui, Jie, Zhang, Haizhang
Multi-task learning is an important trend in machine learning in the era of artificial intelligence and big data. Despite a large amount of research on learning rate estimates for various single-task machine learning algorithms, there is little …
External link:
http://arxiv.org/abs/2104.00453