Showing 1 - 10 of 95 for the search: '"Yun, Juyoung"'
Author:
Yun, Juyoung
In deep learning, Residual Networks (ResNets) have proven effective in addressing the vanishing gradient problem, allowing for the successful training of very deep networks. However, skip connections in ResNets can lead to gradient overlap, where gra…
External link:
http://arxiv.org/abs/2410.21564
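The snippet above mentions gradient overlap arising from skip connections. As a minimal illustration (not the paper's code, and the scalar weight `w` is purely hypothetical), the backward pass of a residual block `y = x + f(x)` sums the gradient from the identity path and the residual path:

```python
import numpy as np

# Minimal residual "block": y = x + f(x), with f(x) = w * x.
# The scalar form and the name `w` are illustrative assumptions.
def forward(x, w):
    return x + w * x  # skip path + residual path

def backward(x, w, grad_y):
    # The gradient w.r.t. x flows through BOTH paths and is summed,
    # which is where the two contributions can overlap:
    grad_skip = grad_y          # identity (skip) path
    grad_residual = grad_y * w  # residual path
    return grad_skip + grad_residual

x, w = 2.0, 0.5
print(backward(x, w, grad_y=1.0))  # 1.0 (skip) + 0.5 (residual) = 1.5
```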
Author:
Yun, Juyoung
The rapid advancements in deep learning necessitate better training methods for deep neural networks (DNNs). As models grow in complexity, vanishing and exploding gradients impede performance, particularly in skip-connected architectures like Deep Re…
External link:
http://arxiv.org/abs/2408.01215
Author:
Yun, Juyoung, Shin, Jungmin
Solar flares, especially C, M, and X class, pose significant risks to satellite operations, communication systems, and power grids. We present a novel approach for predicting extreme solar flares using HMI intensitygrams and magnetograms. By detectin…
External link:
http://arxiv.org/abs/2405.14750
Author:
Yun, Juyoung, Shin, Jungmin
In the era of space exploration, coronal holes on the sun play a significant role due to their impact on satellites and aircraft through their open magnetic fields and increased solar wind emissions. This study employs computer vision techniques to d…
External link:
http://arxiv.org/abs/2405.09802
Author:
Yun, Juyoung
This research embarks on pioneering the integration of gradient sampling optimization techniques, particularly StochGradAdam, into the pruning process of neural networks. Our main objective is to address the significant challenge of maintaining accur…
External link:
http://arxiv.org/abs/2312.16020
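The entry above concerns maintaining accuracy while pruning. As a rough sketch of the pruning side only, here is simple global magnitude pruning; the criterion (smallest-|w| entries) is a common baseline and an assumption here, not necessarily the paper's method:

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    # Illustrative magnitude pruning: zero out the fraction `sparsity`
    # of smallest-|w| entries (ties at the threshold may zero a few
    # extra). Not taken from the paper.
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]
    return np.where(np.abs(weights) <= threshold, 0.0, weights)

w = np.array([[0.5, -0.01], [0.2, -0.03]])
print(magnitude_prune(w, 0.5))  # the 0.01 and 0.03 entries are zeroed
```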
Author:
Yun, Juyoung
In the field of deep learning, the prevalence of models initially trained with 32-bit precision is a testament to its robustness and accuracy. However, the continuous evolution of these models often demands further training, which can be resource-int…
External link:
http://arxiv.org/abs/2311.18587
Author:
Yun, Juyoung
In this paper, we introduce StochGradAdam, a novel optimizer designed as an extension of the Adam algorithm, incorporating stochastic gradient sampling techniques to improve computational efficiency while maintaining robust performance. StochGradAdam…
External link:
http://arxiv.org/abs/2310.17042
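The abstract describes extending Adam with stochastic gradient sampling. A minimal sketch of that idea follows; the random coordinate mask and the `sample_rate` parameter are assumptions for illustration, not StochGradAdam's exact scheme:

```python
import numpy as np

rng = np.random.default_rng(0)

def sampled_adam_step(param, grad, m, v, t, lr=1e-3,
                      beta1=0.9, beta2=0.999, eps=1e-8,
                      sample_rate=0.5):
    # Sketch only: randomly keep a fraction of gradient coordinates
    # each step (the masking scheme is an assumption), then apply a
    # standard Adam update to the sampled gradient.
    mask = rng.random(grad.shape) < sample_rate
    g = np.where(mask, grad, 0.0)
    m = beta1 * m + (1 - beta1) * g
    v = beta2 * v + (1 - beta2) * g * g
    m_hat = m / (1 - beta1 ** t)          # bias correction
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v

p = np.ones(4)
m, v = np.zeros(4), np.zeros(4)
grad = np.array([0.1, -0.2, 0.3, -0.4])
p, m, v = sampled_adam_step(p, grad, m, v, t=1)
```

Coordinates masked out in a given step keep their previous moment estimates decayed rather than updated with fresh gradient information, which is where the computational saving would come from.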
Author:
Yun, Juyoung
Activation functions are the linchpins of deep learning, profoundly influencing both the representational capacity and training dynamics of neural networks. They shape not only the nature of representations but also optimize convergence rates and enh…
External link:
http://arxiv.org/abs/2308.13670
Author:
Yun, Juyoung
In this research, we address critical concerns related to the numerical instability observed in 16-bit computations of machine learning models. Such instability, particularly when employing popular optimization algorithms like Adam, often leads to un…
External link:
http://arxiv.org/abs/2307.16189
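One well-known failure mode consistent with the instability described above (offered here as a plausible illustration, not as the paper's specific finding) is that Adam's default epsilon underflows to zero in float16, so the update's denominator can vanish:

```python
import numpy as np

# Adam's default eps = 1e-8 is below float16's smallest subnormal
# (about 6e-8), so it rounds to exactly zero:
eps16 = np.float16(1e-8)
print(eps16)  # 0.0

# With eps gone, the update m_hat / (sqrt(v_hat) + eps) divides by
# sqrt(v_hat) alone; when v_hat is also ~0, the step blows up.
v_hat = np.float16(0.0)
m_hat = np.float16(1e-3)
with np.errstate(divide="ignore"):
    step = m_hat / (np.sqrt(v_hat) + eps16)
print(step)  # inf
```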
Author:
Yun, Juyoung
In this study, we focus on the development and implementation of a comprehensive ensemble of numerical time series forecasting models, collectively referred to as the Group of Numerical Time Series Prediction Model (G-NM). This inclusive set comprise…
External link:
http://arxiv.org/abs/2306.11667
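The entry above describes an ensemble of time series forecasters. As a bare-bones sketch of the ensembling step only, here is an equal-weight combination of member forecasts; the member models and the uniform weighting are assumptions, not G-NM's actual composition:

```python
import numpy as np

def ensemble_forecast(member_forecasts):
    # member_forecasts: list of 1-D arrays, one forecast series per
    # model. Equal weighting is an illustrative assumption; a real
    # ensemble might weight members by validation error.
    stacked = np.stack(member_forecasts)
    return stacked.mean(axis=0)

f1 = np.array([1.0, 2.0, 3.0])  # hypothetical model A forecast
f2 = np.array([3.0, 2.0, 1.0])  # hypothetical model B forecast
print(ensemble_forecast([f1, f2]))  # [2. 2. 2.]
```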