Showing 1 - 10 of 115 results for search: '"Xiao, Yuanzhang"'
Gradient compression has surfaced as a key technique for addressing the challenge of communication efficiency in distributed learning. In distributed deep learning, however, gradient distributions are observed to be heavy-tailed, with outliers signi…
External link:
http://arxiv.org/abs/2402.01798
To address the communication bottleneck challenge in distributed learning, our work introduces a novel two-stage quantization strategy designed to enhance the communication efficiency of distributed Stochastic Gradient Descent (SGD). The proposed met…
External link:
http://arxiv.org/abs/2402.01160
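The abstract above is cut off before the two-stage scheme itself is described, so as a generic illustration of the kind of gradient quantization it builds on, the sketch below implements plain unbiased stochastic uniform quantization (an assumption for illustration, not the paper's actual method):

```python
import numpy as np

def stochastic_quantize(grad, num_levels=16):
    """Uniformly quantize a gradient vector to num_levels levels with
    unbiased stochastic rounding, so E[quantized] == grad.
    A generic illustration, not the paper's two-stage scheme."""
    scale = np.max(np.abs(grad))
    if scale == 0:
        return grad.copy()
    # Map each entry into [0, num_levels - 1].
    normalized = (grad / scale + 1) / 2 * (num_levels - 1)
    lower = np.floor(normalized)
    # Round up with probability equal to the fractional part.
    prob_up = normalized - lower
    q = lower + (np.random.rand(*grad.shape) < prob_up)
    # Map back to the original range; only q (small ints) and scale
    # need to be communicated.
    return (q / (num_levels - 1) * 2 - 1) * scale

g = np.array([0.3, -1.2, 0.05, 0.9])
print(stochastic_quantize(g))
```

Because the rounding is stochastic and unbiased, the quantization error averages out across SGD iterations, which is why such quantizers preserve convergence while cutting communication.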
Autor:
Luo, Sichun; Yao, Yuxuan; He, Bowei; Huang, Yinya; Zhou, Aojun; Zhang, Xinyi; Xiao, Yuanzhang; Zhan, Mingjie; Song, Linqi
Conventional recommendation methods have achieved notable advancements by harnessing collaborative or sequential information from user behavior. Recently, large language models (LLMs) have gained prominence for their capabilities in understanding and…
External link:
http://arxiv.org/abs/2401.13870
Author:
Luo, Sichun; He, Bowei; Zhao, Haohan; Shao, Wei; Qi, Yanlin; Huang, Yinya; Zhou, Aojun; Yao, Yuxuan; Li, Zongpeng; Xiao, Yuanzhang; Zhan, Mingjie; Song, Linqi
Large Language Models (LLMs) have demonstrated remarkable capabilities and have been extensively deployed across various domains, including recommender systems. Prior research has employed specialized prompts to leverage the in-context learn…
External link:
http://arxiv.org/abs/2312.16018
In this paper, an unsupervised deep learning framework based on dual-path model-driven variational auto-encoders (VAEs) is proposed for angle-of-arrival (AoA) and channel estimation in massive MIMO systems. Specifically designed for channel estimati…
External link:
http://arxiv.org/abs/2305.18744
Federated recommendation systems employ federated learning techniques to safeguard user privacy by transmitting model parameters instead of raw user data between user devices and the central server. Nevertheless, the current federated recommender sys…
External link:
http://arxiv.org/abs/2305.06622
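The core mechanism this abstract describes, clients uploading model parameters rather than raw interaction data, can be sketched with a FedAvg-style weighted server aggregation. This is a generic federated-learning sketch under assumed names (`federated_average`, `client_sizes`), not the specific protocol proposed in the paper:

```python
import numpy as np

def federated_average(client_params, client_sizes):
    """Server-side aggregation: average client parameter vectors,
    weighted by each client's local dataset size (FedAvg-style).
    Only parameters ever leave a device; raw user data stays local."""
    total = sum(client_sizes)
    weights = np.array(client_sizes, dtype=float) / total
    stacked = np.stack(client_params)          # shape: (num_clients, dim)
    return np.tensordot(weights, stacked, axes=1)

# Two clients upload parameter vectors; client 1 has 3x the data.
p1, p2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
print(federated_average([p1, p2], [3, 1]))  # -> [0.75 0.25]
```

Weighting by dataset size makes the global update equivalent to SGD over the pooled (but never centralized) data, which is the standard justification for this aggregation rule.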
Distributed stochastic gradient descent (SGD) with gradient compression has become a popular communication-efficient solution for accelerating distributed learning. One commonly used method for gradient compression is Top-K sparsification, which spar…
External link:
http://arxiv.org/abs/2210.13532
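Top-K sparsification, the compression method this abstract names, keeps only the k largest-magnitude gradient entries and zeroes the rest, so each worker transmits k values plus their indices instead of the full dense gradient. A minimal sketch (function and variable names are illustrative, not from the paper):

```python
import numpy as np

def top_k_sparsify(grad, k):
    """Keep the k largest-magnitude entries of the gradient and zero
    the rest -- the standard Top-K sparsification operator."""
    if k >= grad.size:
        return grad.copy()
    # Indices of the k entries with the largest absolute value.
    idx = np.argpartition(np.abs(grad), -k)[-k:]
    sparse = np.zeros_like(grad)
    sparse[idx] = grad[idx]
    return sparse

g = np.array([0.1, -2.0, 0.3, 1.5, -0.05])
print(top_k_sparsify(g, 2))  # keeps -2.0 and 1.5, zeroes the rest
```

In practice the dropped entries are usually accumulated locally as error feedback and added back into the next gradient, which is what keeps Top-K SGD converging despite the aggressive compression.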
Federated recommendations leverage federated learning (FL) techniques to make privacy-preserving recommendations. Despite the recent success of federated recommender systems, several vital challenges remain to be addressed: (i) The majority of fede…
External link:
http://arxiv.org/abs/2208.10692
The recent popularity of edge devices and the Artificial Intelligence of Things (AIoT) has driven a new wave of contextual recommendations, such as location-based Point of Interest (PoI) recommendations and computing-resource-aware mobile app recommendati…
External link:
http://arxiv.org/abs/2208.09586
Federated recommendation applies federated learning techniques in recommendation systems to help protect user privacy by exchanging models instead of raw user data between user devices and the central server. Due to the heterogeneity in users' attrib…
External link:
http://arxiv.org/abs/2208.09375