Showing 1 - 10 of 4,749 for search: '"Deng,Lei"'
The Forward-Forward (FF) algorithm was recently proposed as a local learning method to address the limitations of backpropagation (BP), offering biological plausibility along with memory-efficient and highly parallelized computational benefits. However…
External link:
http://arxiv.org/abs/2408.14925
The pre-training cost of large language models (LLMs) is prohibitive. One cutting-edge approach to reducing the cost is zero-shot weight transfer, also known in some cases as model growth, which magically transfers the weights trained in a small model…
External link:
http://arxiv.org/abs/2408.08681
Author:
Tan, Dongjie, Ji, Jianghui, Bao, Chunhui, Huang, Xiumin, Chen, Guo, Wang, Su, Dong, Yao, Li, Haitao, Zhang, Junbo, Fang, Liang, Li, Dong, Deng, Lei, Liu, Jiacheng, Zhu, Zi
The Closeby Habitable Exoplanet Survey (CHES) constitutes a mission intricately designed to systematically survey approximately 100 solar-type stars located within the immediate proximity of the solar system, specifically within a range of 10 parsecs…
External link:
http://arxiv.org/abs/2408.06338
Author:
Xu, Mingkun, Yin, Huifeng, Wu, Yujie, Li, Guoqi, Liu, Faqiang, Pei, Jing, Zhong, Shuai, Deng, Lei
In recent years, spiking neural networks (SNNs) have attracted substantial interest due to their potential to replicate the energy-efficient and event-driven processing of biological neurons. Despite this, the application of SNNs in graph representation…
External link:
http://arxiv.org/abs/2407.20508
Current Large Language Models (LLMs) face inherent limitations due to their pre-defined context lengths, which impede their capacity for multi-hop reasoning within extensive textual contexts. While existing techniques like Retrieval-Augmented Generation…
External link:
http://arxiv.org/abs/2406.12331
Increasing the size of a Transformer model does not always lead to enhanced performance. This phenomenon cannot be explained by the empirical scaling laws. Furthermore, improved generalization ability occurs as the model memorizes the training sample…
External link:
http://arxiv.org/abs/2405.08707
Graph representation learning has become a crucial task in machine learning and data mining due to its potential for modeling complex structures such as social networks, chemical compounds, and biological systems. Spiking neural networks (SNNs) have…
External link:
http://arxiv.org/abs/2403.17040
Author:
Yin, Huifeng, Zheng, Hanle, Mao, Jiayi, Ding, Siyuan, Liu, Xing, Xu, Mingkun, Hu, Yifan, Pei, Jing, Deng, Lei
Spiking neural networks (SNNs), inspired by the neural circuits of the brain, are promising in achieving high computational efficiency with biological fidelity. Nevertheless, it is quite difficult to optimize SNNs because the functional roles of their…
External link:
http://arxiv.org/abs/2403.16674
Diffusion models have achieved remarkable success in generating high-quality image and video data. More recently, they have also been used for image compression with high perceptual quality. In this paper, we present a novel approach to extreme video…
External link:
http://arxiv.org/abs/2402.08934
Transformer-based Large Language Models (LLMs) often impose limitations on the length of the text input to ensure the generation of fluent and relevant responses. This constraint restricts their applicability in scenarios involving long texts. We pro…
External link:
http://arxiv.org/abs/2312.09571