Showing 1 - 10 of 131 results for search: '"Dimitriadis, Dimitrios"'
Author:
Mushtaq, Erum, Yaldiz, Duygu Nur, Bakman, Yavuz Faruk, Ding, Jie, Tao, Chenyang, Dimitriadis, Dimitrios, Avestimehr, Salman
Continual self-supervised learning (CSSL) learns a series of tasks sequentially on unlabeled data. Two main challenges of continual learning are catastrophic forgetting and task confusion. While the CSSL problem has been studied to address the catastrophic…
External link:
http://arxiv.org/abs/2407.12188
Author:
Yaldiz, Duygu Nur, Bakman, Yavuz Faruk, Buyukates, Baturalp, Tao, Chenyang, Ramakrishna, Anil, Dimitriadis, Dimitrios, Avestimehr, Salman
In this work, we introduce the Learnable Response Scoring Function (LARS) for Uncertainty Estimation (UE) in generative Large Language Models (LLMs). Current scoring functions for probability-based UE, such as length-normalized scoring and semantic…
External link:
http://arxiv.org/abs/2406.11278
Recent advances in foundation models have enabled audio-generative models that produce high-fidelity sounds associated with music, events, and human actions. Despite the success achieved in modern audio-generative models, the conventional approach to…
External link:
http://arxiv.org/abs/2406.08800
Author:
Bakman, Yavuz Faruk, Yaldiz, Duygu Nur, Buyukates, Baturalp, Tao, Chenyang, Dimitriadis, Dimitrios, Avestimehr, Salman
Generative Large Language Models (LLMs) are widely utilized for their excellence in various tasks. However, their tendency to produce inaccurate or misleading outputs poses a potential risk, particularly in high-stakes environments. Therefore, estimating…
External link:
http://arxiv.org/abs/2402.11756
Many existing federated learning (FL) methods assume clients with fully labeled data, while in realistic settings, clients have limited labels due to the expensive and laborious process of labeling. Limited labeled local data of the clients often leads to their local models…
External link:
http://arxiv.org/abs/2307.08809
Author:
Dun, Chen, Garcia, Mirian Hipolito, Zheng, Guoqing, Awadallah, Ahmed Hassan, Sim, Robert, Kyrillidis, Anastasios, Dimitriadis, Dimitrios
One of the goals in Federated Learning (FL) is to create personalized models that can adapt to the context of each participating client, while utilizing knowledge from a shared global model. Yet, often, personalization requires a fine-tuning step using…
External link:
http://arxiv.org/abs/2306.08586
Author:
Zhang, Tuo, Feng, Tiantian, Alam, Samiul, Dimitriadis, Dimitrios, Lee, Sunwoo, Zhang, Mi, Narayanan, Shrikanth S., Avestimehr, Salman
In this work, we propose GPT-FL, a generative pre-trained model-assisted federated learning (FL) framework. At its core, GPT-FL leverages generative pre-trained models to generate diversified synthetic data. These generated data are used to train a…
External link:
http://arxiv.org/abs/2306.02210
In real-world machine learning systems, labels are often derived from user behaviors that the system wishes to encourage. Over time, new models must be trained as new training examples and features become available. However, feedback loops between…
External link:
http://arxiv.org/abs/2305.14083
Author:
Manoel, Andre, Garcia, Mirian Hipolito, Baumel, Tal, Su, Shize, Chen, Jialei, Miller, Dan, Karmon, Danny, Sim, Robert, Dimitriadis, Dimitrios
Federated Learning (FL) is a novel machine learning approach that allows the model trainer to access more data samples, by training the model across multiple decentralized data sources, while data access constraints are in place. Such trained models…
External link:
http://arxiv.org/abs/2211.09722
Asynchronous learning protocols have regained attention lately, especially in the Federated Learning (FL) setup, where slower clients can severely impede the learning process. Herein, we propose AsyncDrop, a novel asynchronous FL framework that…
External link:
http://arxiv.org/abs/2210.16105