Showing 1 - 2 of 2 results for search: '"Moon, Seunghun"'
Beyond the Transformer, it is important to explore how to exploit the capacity of the MetaFormer, an architecture that is fundamental to the performance improvements of the Transformer. Previous studies have exploited it only for the backbone network…
External link:
http://arxiv.org/abs/2408.07576
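
The MetaFormer abstraction mentioned in this snippet is the general structure underlying the Transformer: a residual token-mixing sub-block followed by a residual channel-MLP sub-block, with the token mixer left as a pluggable component. Below is a minimal PyTorch sketch of that generic block, assuming LayerNorm and a GELU MLP as in common MetaFormer instantiations; the token_mixer placeholder and all names here are illustrative, not code from the paper.

import torch
import torch.nn as nn

class MetaFormerBlock(nn.Module):
    """Generic MetaFormer block: residual token mixing followed by a
    residual channel MLP. The token mixer is pluggable (attention,
    pooling, etc.); the surrounding structure stays fixed."""
    def __init__(self, dim, token_mixer, mlp_ratio=4):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        self.token_mixer = token_mixer             # pluggable component
        self.norm2 = nn.LayerNorm(dim)
        self.mlp = nn.Sequential(
            nn.Linear(dim, dim * mlp_ratio),
            nn.GELU(),
            nn.Linear(dim * mlp_ratio, dim),
        )

    def forward(self, x):                          # x: (batch, tokens, dim)
        x = x + self.token_mixer(self.norm1(x))    # token-mixing sub-block
        x = x + self.mlp(self.norm2(x))            # channel-MLP sub-block
        return x

# Example: an identity mixer stands in for attention/pooling here.
block = MetaFormerBlock(dim=64, token_mixer=nn.Identity())
out = block(torch.randn(2, 196, 64))               # -> (2, 196, 64)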
We present an Encoder-Decoder Attention Transformer, EDAFormer, which consists of the Embedding-Free Transformer (EFT) encoder and the all-attention decoder leveraging our Embedding-Free Attention (EFA) structure. The proposed EFA is a novel global context…
External link:
http://arxiv.org/abs/2407.17261
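
The snippet describes Embedding-Free Attention only at a high level, so the sketch below is a hypothetical reading: it assumes "embedding-free" means global softmax self-attention applied directly to the normalized input features, with no learned query/key/value projection matrices. The class name and the residual placement are assumptions for illustration, not the paper's actual formulation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class EmbeddingFreeAttention(nn.Module):
    """Hypothetical sketch: global attention with no learned Q/K/V
    projections; the normalized features play all three roles."""
    def __init__(self, dim):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.scale = dim ** -0.5                   # standard dot-product scaling

    def forward(self, x):                          # x: (batch, tokens, dim)
        h = self.norm(x)
        attn = F.softmax(h @ h.transpose(-2, -1) * self.scale, dim=-1)
        return x + attn @ h                        # residual global mixing

efa = EmbeddingFreeAttention(dim=64)
out = efa(torch.randn(2, 196, 64))                 # -> (2, 196, 64)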