Showing 1 - 10 of 4,047 results for search: '"KIM, SANG WOO"'
Author: Kim, Bum Jun, Kim, Sang Woo
Regularization of deep neural networks has been an important issue for achieving higher generalization performance without overfitting. Although the popular Dropout method provides a regularization effect, it causes inconsistent properties …
External link: http://arxiv.org/abs/2409.16630
The electroweak monopole, when coupled to gravity, turns into a Reissner-Nordström-type primordial magnetic black hole whose mass is bounded below, with the lower bound $M_P \sqrt{\alpha}$. This changes the overall picture of the monopole production mechanism …
External link: http://arxiv.org/abs/2408.05531
Dynamical systems are often time-varying; modeling them requires a function that evolves with respect to time. Recent studies such as the neural ordinary differential equation proposed a time-dependent neural network, which provides a neural network …
External link: http://arxiv.org/abs/2405.14126
Author: Kim, Bum Jun, Kim, Sang Woo
Vision transformers (ViTs) have demonstrated remarkable performance in a variety of vision tasks. Despite their promising capabilities, training a ViT requires a large amount of diverse data. Several studies empirically found that using rich data augmentation …
External link: http://arxiv.org/abs/2405.14115
Author: Kim, Bum Jun, Kim, Sang Woo
Deep neural networks have exhibited remarkable performance in a variety of computer vision fields, especially in semantic segmentation tasks. Their success is often attributed to multi-level feature fusion, which enables them to understand both global …
External link: http://arxiv.org/abs/2402.01149
The latest advances in deep learning have facilitated the development of highly accurate monocular depth estimation models. However, when training a monocular depth estimation network, practitioners and researchers have observed not-a-number (NaN) loss …
External link: http://arxiv.org/abs/2311.03938
DeepLab is a widely used deep neural network for semantic segmentation, whose success is attributed to its parallel architecture called atrous spatial pyramid pooling (ASPP). ASPP uses multiple atrous convolutions with different atrous rates to extract …
External link: http://arxiv.org/abs/2307.14179
Vision transformers (ViTs) that model an image as a sequence of partitioned patches have shown notable performance in diverse vision tasks. Because partitioning into patches eliminates the image structure, to reflect the order of patches, ViTs utilize an …
External link: http://arxiv.org/abs/2305.04722
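As the snippet above notes, partitioning an image into patches discards spatial structure. A minimal NumPy sketch (not from the paper; the patch size and random stand-in for a learned positional embedding are assumptions) of why a ViT adds a per-patch embedding back:

```python
import numpy as np

def to_patches(img, p):
    """Split an (H, W, C) image into a sequence of flattened p x p patches."""
    H, W, C = img.shape
    # (H//p, p, W//p, p, C) -> (H//p, W//p, p, p, C) -> (num_patches, patch_dim)
    patches = img.reshape(H // p, p, W // p, p, C).transpose(0, 2, 1, 3, 4)
    return patches.reshape(-1, p * p * C)

img = np.arange(4 * 4 * 3, dtype=float).reshape(4, 4, 3)
seq = to_patches(img, p=2)          # 4 patches, each of dimension 12
assert seq.shape == (4, 12)

# The patch sequence alone is permutation-ambiguous; an additive
# positional embedding (random here, standing in for a trained one)
# restores order information.
pos_embed = np.random.randn(*seq.shape)
tokens = seq + pos_embed
assert tokens.shape == (4, 12)
```

The top-left patch gathers pixels (0,0), (0,1), (1,0), (1,1) with their channels, which is what the reshape/transpose pair above implements.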
For the stable optimization of deep neural networks, regularization methods such as dropout and batch normalization have been used in various tasks. Nevertheless, the correct position at which to apply dropout has rarely been discussed, and different positions …
External link: http://arxiv.org/abs/2302.06112
Recently, various normalization layers have been proposed to stabilize the training of deep neural networks. Among them, group normalization is a generalization of layer normalization and instance normalization, allowing a degree of freedom in the …
External link: http://arxiv.org/abs/2302.03193
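The generalization mentioned in the snippet above can be checked directly: with $G$ groups over $C$ channels, $G = 1$ reduces group normalization to layer normalization and $G = C$ to instance normalization. A minimal NumPy sketch (not the authors' code; shapes and epsilon are assumptions):

```python
import numpy as np

def group_norm(x, G, eps=1e-5):
    """Normalize x of shape (N, C, H, W) within G channel groups."""
    N, C, H, W = x.shape
    assert C % G == 0
    g = x.reshape(N, G, C // G, H, W)
    mu = g.mean(axis=(2, 3, 4), keepdims=True)
    var = g.var(axis=(2, 3, 4), keepdims=True)
    return ((g - mu) / np.sqrt(var + eps)).reshape(N, C, H, W)

x = np.random.randn(2, 4, 3, 3)

# G = 1: a single group spanning all channels -> layer normalization.
ln = (x - x.mean(axis=(1, 2, 3), keepdims=True)) / np.sqrt(
    x.var(axis=(1, 2, 3), keepdims=True) + 1e-5)
assert np.allclose(group_norm(x, G=1), ln)

# G = C: one group per channel -> instance normalization.
inn = (x - x.mean(axis=(2, 3), keepdims=True)) / np.sqrt(
    x.var(axis=(2, 3), keepdims=True) + 1e-5)
assert np.allclose(group_norm(x, G=4), inn)
```

Intermediate group counts interpolate between the two, which is the degree of freedom the abstract refers to.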