Showing 1 - 10 of 27
for search: '"Nakada, Ryumei"'
Author:
Zhong, Yibo, Jiang, Haoxiang, Li, Lincan, Nakada, Ryumei, Liu, Tianci, Zhang, Linjun, Yao, Huaxiu, Wang, Haoyu
Fine-tuning pre-trained models is crucial for adapting large models to downstream tasks, often delivering state-of-the-art performance. However, fine-tuning all model parameters is resource-intensive and laborious, leading to the emergence of parameter-efficient fine-tuning methods…
External link:
http://arxiv.org/abs/2410.01870
Imbalanced data and spurious correlations are common challenges in machine learning and data science. Oversampling, which artificially increases the number of instances in the underrepresented classes, has been widely adopted to tackle these challenges…
External link:
http://arxiv.org/abs/2406.03628
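The oversampling idea described in the entry above can be illustrated with a minimal sketch. This is plain random oversampling (duplicating minority-class samples), not the paper's own method; the function name and interface are illustrative assumptions:

```python
import random
from collections import Counter

def random_oversample(X, y, seed=0):
    """Duplicate minority-class samples at random until every class
    matches the majority-class count."""
    rng = random.Random(seed)
    counts = Counter(y)
    target = max(counts.values())  # size of the largest class
    X_out, y_out = list(X), list(y)
    for label, n in counts.items():
        idx = [i for i, lab in enumerate(y) if lab == label]
        for _ in range(target - n):
            j = rng.choice(idx)          # resample an existing instance
            X_out.append(X[j])
            y_out.append(label)
    return X_out, y_out
```

Duplicating instances this way balances class frequencies but adds no new information, which is precisely the limitation that motivates more sophisticated oversampling schemes.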
Differentially private federated learning is crucial for maintaining privacy in distributed environments. This paper investigates the challenges of high-dimensional estimation and inference under the constraints of differential privacy. First, we study…
External link:
http://arxiv.org/abs/2404.16287
Electronic health record (EHR) systems contain a wealth of multimodal clinical data including structured data like clinical codes and unstructured data such as clinical notes. However, many existing EHR-focused studies have traditionally either concentrated…
External link:
http://arxiv.org/abs/2403.14926
The surge in multimodal AI's success has sparked concerns over data privacy in vision-and-language tasks. While CLIP has revolutionized multimodal learning through joint training on images and text, its potential to unintentionally disclose sensitive information…
External link:
http://arxiv.org/abs/2306.08173
Language-supervised vision models have recently attracted great attention in computer vision. A common approach to build such models is to use contrastive learning on paired data across the two modalities, as exemplified by Contrastive Language-Image Pre-training (CLIP)…
External link:
http://arxiv.org/abs/2302.06232
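The contrastive objective on paired image/text data mentioned in the entry above can be sketched with NumPy. This is a generic symmetric InfoNCE loss of the kind CLIP popularized; the embedding shapes and temperature value are illustrative assumptions, not details from the paper:

```python
import numpy as np

def clip_style_loss(img_emb, txt_emb, temperature=0.07):
    """Symmetric InfoNCE loss over a batch of paired image/text embeddings.
    Matched pairs sit on the diagonal of the similarity matrix."""
    # L2-normalize so inner products are cosine similarities.
    img = img_emb / np.linalg.norm(img_emb, axis=1, keepdims=True)
    txt = txt_emb / np.linalg.norm(txt_emb, axis=1, keepdims=True)
    logits = img @ txt.T / temperature       # (batch, batch) similarities
    labels = np.arange(len(logits))          # correct match = same index

    def xent(l):
        l = l - l.max(axis=1, keepdims=True)  # numerical stability
        logp = l - np.log(np.exp(l).sum(axis=1, keepdims=True))
        return -logp[labels, labels].mean()

    # Average the image-to-text and text-to-image cross-entropies.
    return 0.5 * (xent(logits) + xent(logits.T))
```

Aligned pairs (each image embedding close to its own caption embedding) yield a lower loss than mismatched pairs, which is the signal that drives the joint training described above.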
Contrastive learning has achieved state-of-the-art performance in various self-supervised learning tasks and even outperforms its supervised counterpart. Despite its empirical success, theoretical understanding of the superiority of contrastive learning…
External link:
http://arxiv.org/abs/2110.02473
Author:
Nakada, Ryumei, Imaizumi, Masaaki
We investigate the asymptotic risk of a general class of overparameterized likelihood models, including deep models. The recent empirical success of large-scale models has motivated several theoretical studies to investigate a scenario wherein both the…
External link:
http://arxiv.org/abs/2103.00500
Author:
Nakada, Ryumei, Imaizumi, Masaaki
Published in:
Journal of Machine Learning Research, 21(174), 2020
In this study, we prove that an intrinsic low dimensionality of covariates is the main factor that determines the performance of deep neural networks (DNNs). DNNs generally provide outstanding empirical performance. Hence, numerous studies have actively…
External link:
http://arxiv.org/abs/1907.02177
Published in:
Journal of Multivariate Analysis, 183, May 2021