Showing 1 - 10 of 317
for search: '"Moon, Jaekyun"'
Interest in the robustness of deep neural networks against domain shifts has been rapidly increasing in recent years. Most existing works, however, focus on improving the accuracy of the model, not its calibration performance, which is another…
External link:
http://arxiv.org/abs/2402.15019
Author:
Rahimi, Mohammad Mahdi, Bhatti, Hasnain Irshad, Park, Younghyun, Kousar, Humaira, Moon, Jaekyun
Federated Learning (FL) is a decentralized machine learning paradigm that enables collaborative model training across dispersed nodes without forcing individual nodes to share data. However, its broad adoption is hindered by the high communication…
External link:
http://arxiv.org/abs/2311.07485
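The FL aggregation step this abstract alludes to can be sketched as a data-size-weighted average of client models (FedAvg-style); `fedavg`, the layer-list model representation, and the toy weights below are illustrative assumptions, not the paper's actual algorithm:

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Data-size-weighted average of per-client model weights (FedAvg-style).

    client_weights: one model per client, each a list of numpy arrays (one per layer).
    client_sizes:   number of local training samples held by each client.
    """
    total = sum(client_sizes)
    n_layers = len(client_weights[0])
    return [
        sum((n / total) * w[layer] for w, n in zip(client_weights, client_sizes))
        for layer in range(n_layers)
    ]

# Two clients with a single-layer model; client 2 holds 3x more data.
global_model = fedavg(
    [[np.array([0.0, 0.0])], [np.array([2.0, 2.0])]],
    client_sizes=[1, 3],
)
```

The communication cost the snippet mentions comes from shipping every `w[layer]` array to and from the server each round, which motivates compression and sparsification schemes.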
While multi-exit neural networks are regarded as a promising solution for efficient inference via early exits, combating adversarial attacks remains a challenging problem. In multi-exit networks, due to the high dependency among different…
External link:
http://arxiv.org/abs/2311.00428
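The early-exit inference scheme described above can be sketched roughly as below; the sequential feature/classifier decomposition and the fixed confidence threshold are illustrative assumptions, not the paper's method:

```python
import numpy as np

def early_exit_predict(x, submodels, threshold=0.9):
    """Run decoupled submodules in sequence; return the prediction of the
    first exit whose softmax confidence clears the threshold.

    submodels: list of (features, classifier) callables; features feed forward
    into the next stage, classifier is that stage's exit head.
    """
    h = x
    for i, (features, classifier) in enumerate(submodels):
        h = features(h)                 # shared backbone stage
        logits = classifier(h)          # this exit's classification head
        p = np.exp(logits - logits.max())
        p /= p.sum()                    # softmax confidence
        if p.max() >= threshold or i == len(submodels) - 1:
            return int(p.argmax()), i   # (predicted class, exit index)

# Toy usage: an unsure first exit defers to a confident second exit.
identity = lambda h: h
unsure = lambda h: np.array([0.0, 0.1])     # near-uniform logits
confident = lambda h: np.array([10.0, 0.0]) # highly peaked logits
label, exit_idx = early_exit_predict(np.zeros(2), [(identity, unsure),
                                                   (identity, confident)])
```

Adversarial robustness is hard in this setting precisely because all exits share the intermediate features `h`, so a perturbation crafted against one exit tends to transfer to the others.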
Author:
Park, Jungwuk, Han, Dong-Jun, Kim, Jinho, Wang, Shiqiang, Brinton, Christopher G., Moon, Jaekyun
Traditional federated learning (FL) algorithms operate under the assumption that the data distributions at training (source domains) and testing (target domain) are the same. The fact that domain shifts often occur in practice necessitates equipping…
External link:
http://arxiv.org/abs/2311.00227
In domain generalization (DG), the target domain is unknown when the model is being trained, and the trained model should successfully work on an arbitrary (and possibly unseen) target domain during inference. This is a difficult problem, and despite…
External link:
http://arxiv.org/abs/2306.04911
A fundamental challenge to providing edge-AI services is the need for a machine learning (ML) model that achieves personalization (i.e., to individual clients) and generalization (i.e., to unseen data) properties concurrently. Existing techniques in…
External link:
http://arxiv.org/abs/2212.08343
Author:
Bhatti, Hasnain Irshad, Moon, Jaekyun
Locally supervised learning aims to train a neural network based on a local estimation of the global loss function at each decoupled module of the network. Auxiliary networks are typically appended to the modules to approximate the gradient updates…
External link:
http://arxiv.org/abs/2208.00821
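A minimal sketch of the idea, assuming linear modules with a linear auxiliary head and hand-derived gradients (the paper's architectures and auxiliary networks will differ); note that no gradient ever crosses a module boundary:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 4))
y = rng.normal(size=(8, 1))

W1 = rng.normal(size=(4, 4)) * 0.1  # module 1
A1 = rng.normal(size=(4, 1)) * 0.1  # auxiliary head: local estimate of the global loss
W2 = rng.normal(size=(4, 1)) * 0.1  # module 2 (final)

lr, n = 0.01, len(x)
loss0 = float(np.mean((x @ W1 @ W2 - y) ** 2))  # global loss before training
for _ in range(200):
    # Module 1 trains only on its local auxiliary loss ||h @ A1 - y||^2.
    h = x @ W1
    err1 = h @ A1 - y
    gW1 = x.T @ (err1 @ A1.T) / n
    gA1 = h.T @ err1 / n
    W1 -= lr * gW1
    A1 -= lr * gA1
    # Module 2 treats h as a constant input ("detached" from module 1).
    h = x @ W1
    err2 = h @ W2 - y
    W2 -= lr * h.T @ err2 / n
loss1 = float(np.mean((x @ W1 @ W2 - y) ** 2))  # global loss after training
```

Because each module sees only its local loss, training is memory-efficient and parallelizable across modules, at the cost of the auxiliary heads only approximating the true end-to-end gradient.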
Few-shot learning allows machines to classify novel classes using only a few labeled samples. Recently, few-shot segmentation, aiming at semantic segmentation on low sample data, has also seen great interest. In this paper, we propose a learnable module…
External link:
http://arxiv.org/abs/2202.06498
Author:
Sohn, Jy-yong, Shang, Liang, Chen, Hongxu, Moon, Jaekyun, Papailiopoulos, Dimitris, Lee, Kangwook
Mixup is a data augmentation method that generates new data points by mixing a pair of input data. While mixup generally improves prediction performance, it sometimes degrades it. In this paper, we first identify the main causes of…
External link:
http://arxiv.org/abs/2201.02354
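The mixing operation itself can be sketched as below; the Beta(α, α) mixing coefficient and one-hot label interpolation follow the standard mixup recipe, which is an assumption about this paper's setup rather than its contribution:

```python
import numpy as np

def mixup(x1, y1, x2, y2, alpha=0.2, rng=np.random.default_rng(0)):
    """Mix a pair of (input, one-hot label) samples with a Beta-sampled weight."""
    lam = rng.beta(alpha, alpha)  # mixing coefficient in [0, 1]
    return lam * x1 + (1 - lam) * x2, lam * y1 + (1 - lam) * y2

# Mix an all-zeros sample of class 0 with an all-ones sample of class 1.
x, y = mixup(np.zeros(3), np.array([1.0, 0.0]),
             np.ones(3), np.array([0.0, 1.0]))
```

With a small `alpha`, the Beta distribution concentrates near 0 and 1, so most mixed points stay close to one of the two originals; larger `alpha` produces more aggressive interpolation.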
Federated learning has been spotlighted as a way to train neural networks on distributed data with no need for individual nodes to share data. Unfortunately, it has also been shown that adversaries may be able to extract local data contents…
External link:
http://arxiv.org/abs/2012.05433