Showing 1 - 7 of 7 for search: '"Sha, Zhizhou"'
Large Language Models (LLMs) have demonstrated remarkable capabilities across various applications, but their performance on long-context tasks is often limited by the computational complexity of attention mechanisms. This paper introduces a novel ap
External link:
http://arxiv.org/abs/2410.10165
Previous work has demonstrated that attention mechanisms are Turing complete. More recently, it has been shown that a looped 13-layer Transformer can function as a universal programmable computer. In contrast, the multi-layer perceptrons with $\maths
External link:
http://arxiv.org/abs/2410.09375
The computational complexity of the self-attention mechanism in popular transformer architectures poses significant challenges for training and inference, and becomes the bottleneck for long inputs. Is it possible to significantly reduce the quadrati
External link:
http://arxiv.org/abs/2408.13233
Training data privacy is a fundamental problem in modern Artificial Intelligence (AI) applications, such as face recognition, recommendation systems, language generation, and many others, as it may contain sensitive user information related to legal
External link:
http://arxiv.org/abs/2407.13621
We provide a two-way integration for the widely adopted ControlNet by integrating external condition generation algorithms into a single dense prediction method and incorporating its individually trained image generation processes into a single model
External link:
http://arxiv.org/abs/2406.05871
We present TokenCompose, a Latent Diffusion Model for text-to-image generation that achieves enhanced consistency between user-specified text prompts and model-generated images. Despite its tremendous success, the standard denoising process in the La
External link:
http://arxiv.org/abs/2312.03626
In this paper, we introduce a novel generative model, Diffusion Layout Transformers without Autoencoder (Dolfin), which significantly improves the modeling capability with reduced complexity compared to existing methods. Dolfin employs a Transformer-
External link:
http://arxiv.org/abs/2310.16305