Showing 1 - 10 of 219 for search: '"Kim, Young‐Geun"'
Transformer-based large-scale pre-trained models achieve great success, and fine-tuning, which tunes a pre-trained model on a task-specific dataset, is the standard practice to utilize these models for downstream tasks. Recent work has developed adap…
External link:
http://arxiv.org/abs/2412.03587
Author:
Zheng, Xinyuan, Ravid, Orren, Barry, Robert A. J., Kim, Yoojean, Wang, Qian, Kim, Young-geun, Zhu, Xi, He, Xiaofu
Autism spectrum disorders (ASDs) are developmental conditions characterized by restricted interests and difficulties in communication. The complexity of ASD has resulted in a deficiency of objective diagnostic biomarkers. Deep learning methods have g…
External link:
http://arxiv.org/abs/2410.00068
Author:
Kim, Gyudong, Ghasemi, Mehdi, Heidari, Soroush, Kim, Seungryong, Kim, Young Geun, Vrudhula, Sarma, Wu, Carole-Jean
Federated Learning (FL) is a practical approach to train deep learning models collaboratively across user-end devices, protecting user privacy by retaining raw data on-device. In FL, participating user-end devices are highly fragmented in terms of ha…
External link:
http://arxiv.org/abs/2403.04207
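The federated setup summarized in this abstract (local training on each device, with only model updates shared) can be illustrated with a minimal federated-averaging round. This is a generic sketch on a linear model, not the paper's method; all function and variable names are illustrative assumptions.

```python
# Minimal federated averaging (FedAvg) sketch: each device takes a local
# gradient step on its private data; the server averages the resulting
# weights, so raw samples never leave the device.
import numpy as np

def local_step(w, X, y, lr=0.1):
    # One gradient-descent step on this device's private (X, y),
    # minimizing mean squared error of the linear model X @ w.
    grad = 2.0 * X.T @ (X @ w - y) / len(y)
    return w - lr * grad

def fedavg_round(w_global, device_data, lr=0.1):
    # Each device updates a copy of the global weights locally...
    updates = [local_step(w_global.copy(), X, y, lr) for X, y in device_data]
    sizes = np.array([len(y) for _, y in device_data], dtype=float)
    # ...and the server averages the updates, weighted by local dataset size.
    return np.average(updates, axis=0, weights=sizes)
```

In a real FL system the "devices" are remote clients and each round involves many local epochs; the averaging step above is the core aggregation idea.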
Generating samples given a specific label requires estimating conditional distributions. We derive a tractable upper bound of the Wasserstein distance between conditional distributions to lay the theoretical groundwork to learn conditional distributi…
External link:
http://arxiv.org/abs/2308.10145
Author:
Kim, Young Geun, Gupta, Udit, McCrabb, Andrew, Son, Yonglak, Bertacco, Valeria, Brooks, David, Wu, Carole-Jean
To improve the environmental implications of the growing demand for computing, future applications need to improve the carbon-efficiency of computing infrastructures. State-of-the-art approaches, however, do not consider the intermittent nature of ren…
External link:
http://arxiv.org/abs/2304.00404
Author:
Kim, Young Geun, Wu, Carole-Jean
Federated learning (FL) has emerged as a solution to deal with the risk of privacy leaks in machine learning training. This approach allows a variety of mobile devices to collaboratively train a machine learning model without sharing the raw on-devic…
External link:
http://arxiv.org/abs/2211.16669
The recently proposed identifiable variational autoencoder (iVAE) framework provides a promising approach for learning latent independent components (ICs). iVAEs use auxiliary covariates to build an identifiable generation structure from covariates t…
External link:
http://arxiv.org/abs/2202.04206
Author:
Kim, Young Geun, Wu, Carole-Jean
Federated learning enables a cluster of decentralized mobile devices at the edge to collaboratively train a shared machine learning model, while keeping all the raw training samples on device. This decentralized training approach is demonstrated as a…
External link:
http://arxiv.org/abs/2107.08147
The Mixup method (Zhang et al. 2018), which uses linearly interpolated data, has emerged as an effective data augmentation tool to improve generalization performance and the robustness to adversarial examples. The motivation is to curtail undesirable…
External link:
http://arxiv.org/abs/2012.02521
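The linear interpolation at the heart of Mixup (Zhang et al. 2018) fits in a few lines. This sketch assumes a batch of NumPy inputs `x` with one-hot labels `y`; the function and parameter names are illustrative, not from the paper under this entry.

```python
# Mixup data augmentation sketch: each example is linearly interpolated
# with a randomly chosen partner from the same batch, and the labels are
# interpolated with the same weight.
import numpy as np

def mixup_batch(x, y, alpha=0.2, rng=None):
    """Return mixed inputs and labels.

    The interpolation weight lam is drawn from Beta(alpha, alpha),
    following the original Mixup formulation.
    """
    rng = np.random.default_rng() if rng is None else rng
    lam = rng.beta(alpha, alpha)
    perm = rng.permutation(len(x))          # random partner for each example
    x_mix = lam * x + (1.0 - lam) * x[perm]
    y_mix = lam * y + (1.0 - lam) * y[perm]
    return x_mix, y_mix
```

Because the labels are mixed with the same weight as the inputs, each mixed label remains a convex combination of one-hot vectors and still sums to one.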
Author:
Gupta, Udit, Kim, Young Geun, Lee, Sylvia, Tse, Jordan, Lee, Hsien-Hsin S., Wei, Gu-Yeon, Brooks, David, Wu, Carole-Jean
Given recent algorithm, software, and hardware innovation, computing has enabled a plethora of new applications. As computing becomes increasingly ubiquitous, however, so does its environmental impact. This paper brings the issue to the attention of…
External link:
http://arxiv.org/abs/2011.02839