Showing 1 - 10 of 63 for search: '"Zhang, Duzhen"'
Author:
Dong, Jiahua, Liang, Wenqi, Li, Hongliu, Zhang, Duzhen, Cao, Meng, Ding, Henghui, Khan, Salman, Khan, Fahad Shahbaz
Custom diffusion models (CDMs) have attracted widespread attention due to their astonishing generative ability for personalized concepts. However, most existing CDMs unreasonably assume that personalized concepts are fixed and cannot change over time…
External link:
http://arxiv.org/abs/2410.17594
Audio editing involves the arbitrary manipulation of audio content through precise control. Although text-guided diffusion models have made significant advancements in text-to-audio generation, they still face challenges in finding a flexible and precise…
External link:
http://arxiv.org/abs/2406.04350
The success of Deep Reinforcement Learning (DRL) is largely attributed to utilizing Artificial Neural Networks (ANNs) as function approximators. Recent advances in neuroscience have revealed that the human brain achieves efficient reward-based learning…
External link:
http://arxiv.org/abs/2403.20163
An energy-efficient Spikformer has been proposed by integrating the biologically plausible spiking neural network (SNN) and the artificial Transformer, whereby Spiking Self-Attention (SSA) is used to achieve both higher accuracy and lower computational…
External link:
http://arxiv.org/abs/2403.18228
In the past year, MultiModal Large Language Models (MM-LLMs) have undergone substantial advancements, augmenting off-the-shelf LLMs to support MM inputs or outputs via cost-effective training strategies. The resulting models not only preserve the inherent…
External link:
http://arxiv.org/abs/2401.13601
Continual Named Entity Recognition (CNER) is a burgeoning area that involves updating an existing model by incorporating new entity types sequentially. Nevertheless, continual learning approaches are often severely afflicted by catastrophic forgetting…
External link:
http://arxiv.org/abs/2310.14541
Neural ordinary differential equations (ODEs) are widely recognized as the standard for modeling physical mechanisms, which help to perform approximate inference in unknown physical or biological environments. In partially observable (PO) environments…
External link:
http://arxiv.org/abs/2309.14078
Incremental Named Entity Recognition (INER) involves the sequential learning of new entity types without accessing the training data of previously learned types. However, INER faces the challenge of catastrophic forgetting specific to incremental learning…
External link:
http://arxiv.org/abs/2308.08793
By integrating the self-attention capability and the biological properties of Spiking Neural Networks (SNNs), Spikformer applies the flourishing Transformer architecture to SNN design. It introduces a Spiking Self-Attention (SSA) module to mix sparse… (an illustrative code sketch of the SSA idea follows after the link below)
External link:
http://arxiv.org/abs/2308.02557
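The Spiking Self-Attention (SSA) idea named in the Spikformer entry above replaces softmax attention with products of binary spike tensors. Below is a minimal, hypothetical PyTorch sketch of that idea, not the authors' implementation: the class name, the Heaviside binarization, and the fixed scale factor are my own simplifying assumptions.

    import torch
    import torch.nn as nn

    class SpikingSelfAttention(nn.Module):
        """Rough, illustrative sketch of softmax-free spiking self-attention."""

        def __init__(self, dim: int, scale: float = 0.125):
            super().__init__()
            self.q_proj = nn.Linear(dim, dim, bias=False)
            self.k_proj = nn.Linear(dim, dim, bias=False)
            self.v_proj = nn.Linear(dim, dim, bias=False)
            self.scale = scale

        @staticmethod
        def spike(x: torch.Tensor) -> torch.Tensor:
            # Heaviside step: fire a spike (1.0) wherever the input is positive.
            # A real SNN layer would use a surrogate gradient here for training.
            return (x > 0).float()

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, tokens, dim); Q, K and V are binarized into spike form.
            q = self.spike(self.q_proj(x))
            k = self.spike(self.k_proj(x))
            v = self.spike(self.v_proj(x))
            # Spike matrices are non-negative, so a scale factor replaces softmax.
            attn = q @ k.transpose(-2, -1) * self.scale   # (batch, tokens, tokens)
            return self.spike(attn @ v)                   # spike-form output

    if __name__ == "__main__":
        layer = SpikingSelfAttention(dim=64)
        out = layer(torch.randn(2, 16, 64))
        print(out.shape)  # torch.Size([2, 16, 64])

Because queries, keys, and values are 0/1-valued, the attention product reduces to additions, which is the sparsity and energy argument usually made for spike-based attention.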
Federated learning-based semantic segmentation (FSS) has drawn widespread attention via decentralized training on local clients. However, most FSS models assume categories are fixed in advance and thus suffer severe forgetting of old categories…
External link:
http://arxiv.org/abs/2304.04620