Showing 1 - 10 of 578 for search: '"ZHAO Xuyang"'
The scaling capability has been widely validated with respect to the number of parameters and the size of training data. One important question that remains unexplored is whether a similar scaling capability also exists with respect to the number of vis…
External link:
http://arxiv.org/abs/2412.18387
In this paper, we design constant modulus probing waveforms with good correlation properties for large-scale collocated multi-input multi-output (MIMO) radar systems. The main content is as follows: First, we formulate the design problem as a fourth-…
External link:
http://arxiv.org/abs/2410.08287
We provide a statistical analysis of regularization-based continual learning on a sequence of linear regression tasks, with emphasis on how different regularization terms affect model performance. We first derive the convergence rate for the orac…
External link:
http://arxiv.org/abs/2406.06213
With large training datasets and massive amounts of computing resources, large language models (LLMs) achieve remarkable performance in comprehension and generation. Based on those powerful LLMs, the model fine-tuned with domain-specific datase…
External link:
http://arxiv.org/abs/2401.05908
Exploration systems are critical for enhancing the autonomy of robots. Due to the unpredictability of the future planning space, existing methods either adopt an inefficient greedy strategy or require a lot of resources to obtain a global solution. …
External link:
http://arxiv.org/abs/2307.02852
Author:
Zhang, Bing, Zhao, Xuyang, Nie, Jiangtian, Tang, Jianhang, Chen, Yuling, Zhang, Yang, Niyato, Dusit
Most recent surveys and reviews on Influential Node Ranking Methods (INRMs) highlight discussions of the methods' technical details, but in-depth research is still lacking on the fundamental issue of how to verify the considerable influence of thes…
External link:
http://arxiv.org/abs/2303.12588
Self-Supervised Learning (SSL) is a paradigm that leverages unlabeled data for model training. Empirical studies show that SSL can achieve promising performance in distribution shift scenarios, where the downstream and training distributions differ.
External link:
http://arxiv.org/abs/2303.01092
Real-world large-scale datasets are both noisily labeled and class-imbalanced. These issues seriously hurt the generalization of trained models. It is hence important to address simultaneous incorrect labeling and class imbalance, i.e., the probl…
External link:
http://arxiv.org/abs/2211.10955
Federated learning, where algorithms are trained across multiple decentralized devices without sharing local data, is increasingly popular in distributed machine learning practice. Typically, a graph structure $G$ exists behind local devices for comm…
External link:
http://arxiv.org/abs/2209.08737
Published in:
Molecular Catalysis, December 2024, Vol. 569