Showing 1 - 10 of 18,909 for search: '"Xinrong"'
Academic article
This result cannot be displayed to users who are not logged in. You must log in to view this result.
Large language models (LLMs), such as ChatGPT released by OpenAI, have attracted significant attention from both industry and academia due to their demonstrated ability to generate high-quality content for various tasks. Despite the impressive capabi…
External link:
http://arxiv.org/abs/2411.04704
One essential advantage of recurrent neural networks (RNNs) over transformer-based language models is their linear computational complexity with respect to the sequence length, which makes them much faster at handling long sequences during inference. Howe…
External link:
http://arxiv.org/abs/2410.07145
Author:
Xie, Xinrong, Ma, Fei, Rui, W. B., Dong, Zhaozhen, Du, Yulin, Xie, Wentao, Zhao, Y. X., Chen, Hongsheng, Gao, Fei, Xue, Haoran
Relativistic quasiparticle excitations arising from band degeneracies in crystals not only offer exciting chances to test hypotheses in particle physics but also play crucial roles in the transport and topological properties of materials and metamate…
External link:
http://arxiv.org/abs/2410.06058
Adaptation methods have recently been developed to adapt depth foundation models to endoscopic depth estimation. However, such approaches typically under-perform training, since they limit the parameter search to a low-rank subspace and alter the training d…
External link:
http://arxiv.org/abs/2410.00979
Academic article
This result cannot be displayed to users who are not logged in. You must log in to view this result.
Author:
Xu, Jianan, Ma, Xinrong
In this paper, we first establish two new Bailey pairs by finding two generalizations of Euler's pentagonal number theorem. Next, we specialize the Bailey lemmas with these two Bailey pairs. As applications, we finally establish some $q$-series tra…
External link:
http://arxiv.org/abs/2409.00680
Contrastive learning with the nearest neighbor has proved to be one of the most efficient self-supervised learning (SSL) techniques by utilizing the similarity of multiple instances within the same class. However, its efficacy is constrained, as the n…
External link:
http://arxiv.org/abs/2408.16965
In the field of medical images, although various works find the Swin Transformer promisingly effective for pixelwise dense prediction, whether pre-training these models without an extra dataset can further boost performance on the downstream…
External link:
http://arxiv.org/abs/2408.05889
The multiple instance learning (MIL) problem is currently solved from either a bag-classification or an instance-classification perspective, both of which ignore important information contained in some instances and yield limited performance. For example,…
External link:
http://arxiv.org/abs/2408.04813