Showing 1 - 6 of 6
for search: '"Zhang, Luna M."'
Author:
Zhang, Luna M.
A traditional artificial neural network (ANN) is normally trained slowly by a gradient descent algorithm, such as the backpropagation algorithm, since a large number of hyperparameters of the ANN need to be fine-tuned with many training epochs. Since…
External link:
http://arxiv.org/abs/2002.04458
Author:
Zhang, Luna M.
Traditionally, an artificial neural network (ANN) is trained slowly by a gradient descent algorithm such as the backpropagation algorithm, since a large number of hyperparameters of the ANN need to be fine-tuned with many training epochs. To highly sp…
External link:
http://arxiv.org/abs/2001.08886
Author:
Zhang, Luna M.
In recent years, there have been many popular Convolutional Neural Networks (CNNs), such as Google's Inception-V4, that have performed very well for various image classification problems. These commonly used CNN models usually use the same activation…
External link:
http://arxiv.org/abs/1906.11912
Author:
Zhang, Luna M.
Convolutional Neural Networks (CNNs) usually use the same activation function, such as ReLU, for all convolutional layers. There are performance limitations to using only ReLU. In order to achieve better classification performance, reduce training an…
External link:
http://arxiv.org/abs/1811.11996
Author:
Zhang, Luna M.
Traditional Convolutional Neural Networks (CNNs) typically use the same activation function (usually ReLU) for all neurons with non-linear mapping operations. For example, the deep convolutional architecture Inception-v4 uses ReLU. To improve the cla…
External link:
http://arxiv.org/abs/1805.11788
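The recurring idea in the three abstracts above is replacing a single network-wide ReLU with a different activation function per layer. A toy forward pass can illustrate it (a minimal sketch only: dense layers stand in for convolutions, and the particular activation choices and shapes here are assumptions, not the papers' actual architectures):

```python
import numpy as np

# Hypothetical sketch: each layer is paired with its own activation
# function, instead of applying ReLU uniformly across the network.

def relu(x):
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.1):
    return np.where(x > 0, x, alpha * x)

def tanh(x):
    return np.tanh(x)

# One activation per layer -- this per-layer assignment is the point.
layer_activations = [relu, leaky_relu, tanh]

def forward(x, weights):
    """Run x through len(weights) stages, each with its own activation."""
    for w, act in zip(weights, layer_activations):
        x = act(x @ w)
    return x

rng = np.random.default_rng(0)
weights = [rng.standard_normal((4, 4)) * 0.5 for _ in range(3)]
out = forward(rng.standard_normal((1, 4)), weights)
```

Because the final stage uses tanh, the sketch's output is bounded in [-1, 1]; swapping entries of `layer_activations` is all it takes to explore other per-layer combinations.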
Author:
Zhang, Luna M.
Published in:
2015 IEEE International Conference on Big Data (Big Data); 2015, p2849-2851, 3p