Showing 1 - 10 of 472 for the search: '"Wang, Junxiao"'
Author:
Pei, Jiahuan, Viola, Irene, Huang, Haochen, Wang, Junxiao, Ahsan, Moonisa, Ye, Fanghua, Jiang, Yiming, Sai, Yao, Wang, Di, Chen, Zhumin, Ren, Pengjie, Cesar, Pablo
Autonomous artificial intelligence (AI) agents have emerged as promising protocols for automatically understanding the language-based environment, particularly with the exponential development of large language models (LLMs). However, a fine-grained,
External link:
http://arxiv.org/abs/2405.13034
For a del Pezzo surface of degree $\geq 3$, we compute the oscillatory integral for its mirror Landau-Ginzburg model in the sense of Gross-Hacking-Keel [Mark Gross, Paul Hacking, and Sean Keel, "Mirror symmetry for log Calabi-Yau surfaces I". In: Pub
External link:
http://arxiv.org/abs/2309.02154
As Federated Learning (FL) has gained increasing attention, it has become widely acknowledged that straightforwardly applying stochastic gradient descent (SGD) on the overall framework when learning over a sequence of tasks results in the phenomenon
External link:
http://arxiv.org/abs/2306.01431
Online Class-Incremental (OCI) learning has sparked new approaches to expand the previously trained model knowledge from sequentially arriving data streams with new classes. Unfortunately, OCI learning can suffer from catastrophic forgetting (CF) as
External link:
http://arxiv.org/abs/2303.07864
Federated Learning (FL) is an emerging paradigm that enables distributed users to collaboratively and iteratively train machine learning models without sharing their private data. Motivated by the effectiveness and robustness of self-attention-based
External link:
http://arxiv.org/abs/2211.08025
Multimodal learning (MML) aims to jointly exploit the common priors of different modalities to compensate for their inherent limitations. However, existing MML methods often optimize a uniform objective for different modalities, leading to the notori
External link:
http://arxiv.org/abs/2211.07089
Author:
Wu, Leijie, Guo, Song, Ding, Yaohong, Wang, Junxiao, Xu, Wenchao, Xu, Richard Yida, Zhang, Jie
Self-attention mechanisms, especially multi-head self-attention (MSA), have achieved great success in many fields such as computer vision and natural language processing. However, many existing vision transformer (ViT) works simply inherit transform
External link:
http://arxiv.org/abs/2211.08543
Quick global aggregation of effective distributed parameters is crucial to federated learning (FL), which requires adequate bandwidth for parameters communication and sufficient user data for local training. Otherwise, FL may cost excessive training
External link:
http://arxiv.org/abs/2208.11625
Recent studies have shown that the training samples can be recovered from gradients, which are called Gradient Inversion (GradInv) attacks. However, there remains a lack of extensive surveys covering recent advances and thorough analysis of this issu
External link:
http://arxiv.org/abs/2206.07284
In this paper, an improved multi-step finite control set model predictive current control (FCS-MPCC) strategy with speed loop disturbance compensation is proposed for permanent magnet synchronous machine (PMSM) drives system. A multi-step prediction
External link:
http://arxiv.org/abs/2205.07213