Showing 1 - 3 of 3 for search: '"Inchul Yoo"'
Published in:
Applied Sciences, Vol 13, Iss 21, p 11730 (2023)
Stochastic gradient descent (SGD) is an optimization method commonly used in deep learning to train deep neural network (DNN) models. Recent studies on DNN training have proposed pipeline parallelism, a type of model parallelism, to accelerate SGD … (a minimal sketch of the basic SGD update appears after this record).
External link:
https://doaj.org/article/08e927370db64a319afb66ce0f96d4bc
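As a rough illustration of the optimizer this abstract builds on, here is a minimal Python/NumPy sketch of the plain SGD update. The function name, learning rate, and toy regression problem are illustrative assumptions, not details from the paper, and the pipeline-parallel scheduling the abstract refers to is not shown.

```python
import numpy as np

def sgd_step(params, grads, lr=0.1):
    """One vanilla SGD update: theta <- theta - lr * grad.

    `params` and `grads` are parallel lists of NumPy arrays; the names and
    default learning rate are illustrative, not values from the paper.
    """
    return [p - lr * g for p, g in zip(params, grads)]

# Toy usage (illustrative only): fit y = 2x with a single weight on mini-batches.
rng = np.random.default_rng(0)
w = [np.array([0.0])]
for _ in range(200):
    x = rng.uniform(-1.0, 1.0, size=32)          # mini-batch of inputs
    y = 2.0 * x                                   # targets
    err = w[0][0] * x - y                         # residuals
    grad = [np.array([2.0 * np.mean(err * x)])]   # d(MSE)/dw
    w = sgd_step(w, grad)

print(w[0][0])  # converges toward 2.0
```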
Published in:
VTC Fall
In this paper, a relay selection scheme is studied to improve data-rate performance in a multi-user environment. Previous opportunistic relay selection protocols consider only the channel condition when selecting a relay. However, the target performance is also affected …
Published in:
VTC Fall
In this paper, we address a relay selection issue to improve throughput performance in a signaling-overhead- and delay-constrained system. The proposed relay selection protocol, which is based on the analytic hierarchy process (AHP), adopts signal-to- …
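The abstract mentions an AHP-based relay selection protocol. Below is a minimal Python/NumPy sketch of how AHP-derived weights can score candidate relays by a weighted sum of normalized criteria; the pairwise-comparison matrix, criterion names, and relay values are hypothetical placeholders, not the paper's actual protocol.

```python
import numpy as np

def ahp_weights(pairwise):
    """Derive criterion weights from an AHP pairwise-comparison matrix
    via its principal eigenvector, normalized to sum to 1."""
    vals, vecs = np.linalg.eig(pairwise)
    w = np.abs(vecs[:, np.argmax(vals.real)].real)
    return w / w.sum()

def select_relay(candidates, weights):
    """Pick the relay with the highest weighted sum of max-normalized criteria.

    `candidates` maps relay id -> criterion vector (same order as `weights`).
    """
    mat = np.array(list(candidates.values()), dtype=float)
    mat = mat / mat.max(axis=0)          # normalize each criterion to [0, 1]
    scores = mat @ weights
    ids = list(candidates.keys())
    return ids[int(np.argmax(scores))]

# Hypothetical criteria: [link quality (e.g., SNR in dB), inverted queue backlog].
# The pairwise matrix below is an assumed example, not taken from the paper.
pairwise = np.array([[1.0, 3.0],
                     [1 / 3.0, 1.0]])
w = ahp_weights(pairwise)
relays = {"R1": [18.0, 0.4], "R2": [22.0, 0.2], "R3": [15.0, 0.9]}
print(select_relay(relays, w))  # prints the highest-scoring relay id
```

The eigenvector step is the standard way AHP turns pairwise judgments into criterion weights; the scoring and normalization shown here are one simple choice among several the paper might use.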