Showing 1 - 10 of 43 results for search: '"Reisser, Matthias"'
Author:
Cho, Wonguk, Choi, Seokeon, Das, Debasmit, Reisser, Matthias, Kim, Taesup, Yun, Sungrack, Porikli, Fatih
Recent advancements in text-to-image diffusion models have enabled the personalization of these models to generate custom images from textual prompts. This paper presents an efficient LoRA-based personalization approach for on-device subject-driven generation…
External link:
http://arxiv.org/abs/2411.01179
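The entry above mentions LoRA-based personalization. As a rough illustration only (not the paper's actual method), a LoRA adapter keeps the pretrained weight frozen and trains a low-rank correction B·A on top of it; the layer sizes, rank r, and scaling alpha below are arbitrary choices, and the sketch uses plain PyTorch:

import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Frozen linear layer plus a trainable low-rank update: W x + (alpha/r) * B A x."""
    def __init__(self, base: nn.Linear, r: int = 4, alpha: float = 8.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False          # only the adapter is personalized
        self.A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, r))  # zero init: no change at start
        self.scale = alpha / r

    def forward(self, x):
        return self.base(x) + self.scale * (x @ self.A.T @ self.B.T)

layer = LoRALinear(nn.Linear(768, 768), r=4)   # hypothetical dimensions
out = layer(torch.randn(2, 768))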
The proliferation of edge devices has brought Federated Learning (FL) to the forefront as a promising paradigm for decentralized and collaborative model training while preserving the privacy of clients' data. However, FL struggles with a significant…
External link:
http://arxiv.org/abs/2405.07925
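Several results in this list concern federated optimization. As generic background only (not the method proposed in the paper above), one FedAvg-style round has each client train locally on its own data and the server average the results weighted by client data size; the least-squares clients below are placeholders:

import numpy as np

def local_update(weights, client_data, lr=0.1, epochs=1):
    """Stand-in for local training; a real client would run SGD on its private data."""
    X, y = client_data
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)   # least-squares gradient as a placeholder
        w -= lr * grad
    return w

def federated_round(global_w, clients):
    """One round: broadcast, local training, data-size-weighted averaging."""
    sizes = [len(y) for _, y in clients]
    updates = [local_update(global_w, c) for c in clients]
    total = sum(sizes)
    return sum(n / total * w for n, w in zip(sizes, updates))

rng = np.random.default_rng(0)
clients = [(rng.normal(size=(20, 5)), rng.normal(size=20)) for _ in range(3)]
w = np.zeros(5)
for _ in range(10):
    w = federated_round(w, clients)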
We investigate contrastive learning in the federated setting through the lens of SimCLR and multi-view mutual information maximization. In doing so, we uncover a connection between contrastive representation learning and user verification; by adding…
External link:
http://arxiv.org/abs/2405.02081
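The SimCLR-style contrastive learning mentioned above is typically trained with the InfoNCE / NT-Xent objective, where the two views of the same example are positives and every other embedding in the batch is a negative. A minimal, non-federated PyTorch version for illustration (batch size and embedding width are arbitrary):

import torch
import torch.nn.functional as F

def info_nce(z1, z2, temperature=0.5):
    """SimCLR-style NT-Xent loss over two batches of view embeddings."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    z = torch.cat([z1, z2], dim=0)                      # (2N, d)
    sim = z @ z.T / temperature                         # scaled cosine similarities
    n = z1.shape[0]
    mask = torch.eye(2 * n, dtype=torch.bool)
    sim = sim.masked_fill(mask, float("-inf"))          # drop self-similarity
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)

loss = info_nce(torch.randn(8, 32), torch.randn(8, 32))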
Well-tuned hyperparameters are crucial for obtaining good generalization behavior in neural networks. They can enforce appropriate inductive biases, regularize the model and improve performance -- especially in the presence of limited data. In this work…
External link:
http://arxiv.org/abs/2304.14766
Federated Learning (FL) is a machine learning paradigm for learning models distributively from decentralized data that remains on-device. Despite the success of standard federated optimization methods, such as Federated Averaging (FedAvg)…
External link:
http://arxiv.org/abs/2206.10844
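For reference, the FedAvg aggregation named above is usually written as a data-size-weighted average of the locally trained client models (standard textbook form, not anything specific to this paper):

w_{t+1} = \sum_{k=1}^{K} \frac{n_k}{n}\, w_{t+1}^{k}, \qquad n = \sum_{k=1}^{K} n_k,

where w_{t+1}^{k} is the model client k obtains by running local SGD from the broadcast model w_t on its n_k examples.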
Federated learning describes the distributed training of models across multiple clients while keeping the data private on-device. In this work, we view the server-orchestrated federated learning process as a hierarchical latent variable model where the…
External link:
http://arxiv.org/abs/2111.10192
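One common way to write such a hierarchical view, offered here only as a guess at the general shape rather than the paper's exact model: a global variable \theta generates per-client parameters \phi_k, which in turn explain each client's local data \mathcal{D}_k, so that server-side aggregation becomes (approximate) inference over \theta:

p(\theta, \phi_{1:K}, \mathcal{D}_{1:K}) = p(\theta) \prod_{k=1}^{K} p(\phi_k \mid \theta)\, p(\mathcal{D}_k \mid \phi_k)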
Privacy and communication efficiency are important challenges in federated training of neural networks, and combining them is still an open problem. In this work, we develop a method that unifies highly compressed communication and differential privacy…
External link:
http://arxiv.org/abs/2111.05454
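A naive way to combine the two ingredients named above is to clip the client update, add Gaussian noise for differential privacy, and then quantize it before transmission. The NumPy sketch below is generic background, not the unified method the abstract refers to, and the clipping norm, noise multiplier, and bit width are arbitrary:

import numpy as np

def privatize_and_compress(update, clip=1.0, noise_mult=1.1, bits=4, rng=None):
    """Clip to an L2 ball, add Gaussian noise, then uniformly quantize the result."""
    if rng is None:
        rng = np.random.default_rng()
    norm = np.linalg.norm(update)
    update = update * min(1.0, clip / (norm + 1e-12))                   # L2 clipping
    update = update + rng.normal(0.0, noise_mult * clip, update.shape)  # Gaussian mechanism
    levels = 2 ** bits - 1
    lo, hi = update.min(), update.max()
    q = np.round((update - lo) / (hi - lo + 1e-12) * levels)            # integer codes in [0, levels]
    return q * (hi - lo) / levels + lo                                  # dequantized for readability

noisy = privatize_and_compress(np.random.default_rng(0).normal(size=100))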
Federated learning (FL) has emerged as the predominant approach for collaborative training of neural network models across multiple users, without the need to gather the data at a central location. One of the important challenges in this setting is d…
External link:
http://arxiv.org/abs/2107.06724
Neural network quantization has become an important research area due to its great impact on the deployment of large models on resource-constrained devices. In order to train networks that can be effectively discretized without loss of performance, we introduce…
External link:
http://arxiv.org/abs/1810.01875
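A common baseline for training networks that survive discretization is uniform fake-quantization with a straight-through estimator, sketched below as background; the abstract above introduces its own procedure, which this code does not claim to reproduce, and the 8-bit symmetric scheme here is an arbitrary choice:

import torch

class RoundSTE(torch.autograd.Function):
    """Round to the nearest integer in the forward pass, pass gradients straight through."""
    @staticmethod
    def forward(ctx, x):
        return torch.round(x)

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output

def fake_quantize(w, bits=8):
    """Uniform symmetric fake-quantization of a weight tensor."""
    qmax = 2 ** (bits - 1) - 1
    scale = w.abs().max() / qmax + 1e-12
    return RoundSTE.apply(w / scale).clamp(-qmax - 1, qmax) * scale

w = torch.randn(4, 4, requires_grad=True)
loss = fake_quantize(w).sum()
loss.backward()     # gradients flow thanks to the straight-through estimator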
Published in:
Nature Communications, 9(1), 5218 (2018)
Zygotic genome activation (ZGA), the onset of transcription after initial quiescence, is a major developmental step in many species, which occurs after ten cell divisions in zebrafish embryos. How transcription factor-chromatin interactions evolve during…
External link:
http://arxiv.org/abs/1710.03539