Showing 1 - 10 of 366 results for search: '"Hongsun An"'
In this work, we investigate a stochastic gradient descent method for solving inverse problems that can be written as systems of linear or nonlinear ill-posed equations in Banach spaces. The method uses only a randomly selected equation at each itera…
External link:
http://arxiv.org/abs/2409.04973
Published in:
IEEE Access, Vol 9, Pp 109976-109985 (2021)
In this paper, we study a waveform design based on time-reversal (TR) for multi-user wireless power transfer (WPT) systems in multipath channels. The existing waveforms for WPT using the non-linear energy harvesting (EH) model have been designed in t…
External link:
https://doaj.org/article/04cdc3f56942402898f5976f9e1839c1
Author:
Yim, Jinkyu, Song, Jaeyong, Choi, Yerim, Lee, Jaebeen, Jung, Jaewon, Jang, Hongsun, Lee, Jinho
Training large language models (LLMs) is known to be challenging because of the huge computational and memory capacity requirements. To address these issues, it is common to use a cluster of GPUs with 3D parallelism, which splits a model along the da…
External link:
http://arxiv.org/abs/2405.18093
Adversarial robustness of a neural network is a significant concern when it is applied to security-critical domains. In this situation, adversarial distillation is a promising option, which aims to distill the robustness of the teacher network to im…
External link:
http://arxiv.org/abs/2403.06668
The recent huge advance of Large Language Models (LLMs) is mainly driven by the increase in the number of parameters. This has led to substantial memory capacity requirements, necessitating the use of dozens of GPUs just to meet the capacity. One pop…
External link:
http://arxiv.org/abs/2403.06664
Graph neural networks (GNNs) are one of the rapidly growing fields within deep learning. While many distributed GNN training frameworks have been proposed to increase the training throughput, they face three limitations when applied to multi-server c…
External link:
http://arxiv.org/abs/2311.06837
Training large deep neural network models is highly challenging due to their tremendous computational and memory requirements. Blockwise distillation provides one promising method towards faster convergence by splitting a large model into multiple sm…
External link:
http://arxiv.org/abs/2301.12443
Author:
Song, Jaeyong, Yim, Jinkyu, Jung, Jaewon, Jang, Hongsun, Kim, Hyung-Jin, Kim, Youngsok, Lee, Jinho
In training of modern large natural language processing (NLP) models, it has become a common practice to split models across multiple GPUs using 3D parallelism. Such a technique, however, suffers from a high overhead of inter-node communication. Compressin…
External link:
http://arxiv.org/abs/2301.09830
Author:
Hongsun Kim, Jun Ho Lee, Su Ryeun Chung, Pyo Won Park, Taek Kyu Park, I-Seok Kang, June Huh, Duk-Kyung Kim, Yang Hyun Cho, Kiick Sung
Published in:
Frontiers in Cardiovascular Medicine, Vol 11 (2024)
Background: This study aimed to investigate the influence of early diagnosis (ED) on surgical outcomes in patients definitively diagnosed with Loeys-Dietz syndrome (LDS). Methods: A retrospective review was conducted on 38 patients with LDS who underwent…
External link:
https://doaj.org/article/34dc6047d5ec47f88893a589d8aea3e7
Author:
Kim, Hongsun, Lee, Ok Jeong, Lee, Jun Ho, Kim, Yun Jin, Chung, Su Ryeun, Park, Taek Kyu, Kim, Duk-Kyung, Park, Pyo Won, Sung, Kiick
Published in:
The Journal of Thoracic and Cardiovascular Surgery, September 2024