Showing 1 - 10 of 18 results for search '"Zimerman, Itamar"'
Author:
Zimerman, Itamar, Adir, Allon, Aharoni, Ehud, Avitan, Matan, Baruch, Moran, Drucker, Nir, Lerner, Jenny, Masalha, Ramy, Meiri, Reut, Soceanu, Omri
Modern cryptographic methods for implementing privacy-preserving LLMs such as Homomorphic Encryption (HE) require the LLMs to have a polynomial form. Forming such a representation is challenging because Transformers include non-polynomial components, …
External link:
http://arxiv.org/abs/2410.09457
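The core difficulty the abstract mentions can be illustrated with a minimal sketch: replacing a non-polynomial Transformer component (here GELU) with a low-degree polynomial fit, which an HE scheme could then evaluate under encryption. This is an illustrative stand-in, not the paper's actual method; the degree and fitting range are arbitrary choices.

```python
import math

import numpy as np

def gelu(x: float) -> float:
    # Exact GELU, one of the non-polynomial Transformer components
    # that an HE-friendly model must replace.
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

# Fit a low-degree polynomial to GELU over a bounded input range;
# HE schemes can only evaluate additions and multiplications, so a
# polynomial surrogate is required. (Degree and range are illustrative.)
xs = np.linspace(-4.0, 4.0, 2001)
ys = np.array([gelu(v) for v in xs])
coeffs = np.polyfit(xs, ys, deg=8)   # degree-8 least-squares fit
poly_gelu = np.poly1d(coeffs)

# Worst-case approximation error over the fitting range.
max_err = float(np.max(np.abs(poly_gelu(xs) - ys)))
```

The quality of such fits degrades quickly outside the fitting range, which is one reason forming a full polynomial representation of an LLM is challenging.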
Author:
Ben-Kish, Assaf, Zimerman, Itamar, Abu-Hussein, Shady, Cohen, Nadav, Globerson, Amir, Wolf, Lior, Giryes, Raja
Long-range sequence processing poses a significant challenge for Transformers due to their quadratic complexity in input length. A promising alternative is Mamba, which demonstrates high performance and achieves Transformer-level capabilities while …
External link:
http://arxiv.org/abs/2406.14528
Recent advances in efficient sequence modeling have led to attention-free layers, such as Mamba, RWKV, and various gated RNNs, all featuring sub-quadratic complexity in sequence length and excellent scaling properties, enabling the construction of a …
External link:
http://arxiv.org/abs/2405.16504
The Mamba layer offers an efficient selective state space model (SSM) that is highly effective in modeling multiple domains, including NLP, long-range sequence processing, and computer vision. Selective SSMs are viewed as dual models, in which one …
External link:
http://arxiv.org/abs/2403.01590
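The selective SSM recurrence mentioned above can be sketched in its recurrent ("RNN") mode. This is a minimal illustration under assumed gate forms, not Mamba's actual parameterization; what makes the SSM "selective" is that the transition and input terms depend on the current input.

```python
import numpy as np

def selective_ssm_scan(x, W_a, W_b, W_c):
    # Minimal selective SSM run step by step.
    # W_a, W_b, W_c are hypothetical weight matrices; the gate forms
    # below are illustrative, not the paper's.
    T, _ = x.shape
    n = W_a.shape[1]                       # hidden state size
    h = np.zeros(n)
    ys = np.empty((T, W_c.shape[1]))
    for t in range(T):
        a = np.exp(-np.abs(x[t] @ W_a))    # input-dependent decay in (0, 1]
        b = x[t] @ W_b                     # input-dependent state injection
        h = a * h + b                      # per-step recurrence
        ys[t] = h @ W_c                    # linear readout
    return ys

# Example with random weights (shapes only; no trained model implied).
rng = np.random.default_rng(0)
x = rng.standard_normal((16, 4))           # 16 steps, input dim 4
W_a = rng.standard_normal((4, 8))
W_b = rng.standard_normal((4, 8))
W_c = rng.standard_normal((8, 4))
y = selective_ssm_scan(x, W_a, W_b, W_c)
```

Because the per-step update is linear in the state, the same computation also admits a parallel (convolution-like) formulation, which is the duality the abstract refers to.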
Designing privacy-preserving deep learning models is a major challenge within the deep learning community. Homomorphic Encryption (HE) has emerged as one of the most promising approaches in this realm, enabling the decoupling of knowledge between the …
External link:
http://arxiv.org/abs/2311.08610
Author:
Zimerman, Itamar, Wolf, Lior
In recent years, Vision Transformers have attracted increasing interest from computer vision researchers. However, the advantage of these transformers over CNNs is only fully manifested when trained over a large dataset, mainly due to the reduced …
External link:
http://arxiv.org/abs/2309.13600
Author:
Drucker, Nir, Zimerman, Itamar
Homomorphic Encryption (HE) is a cryptographic tool that allows performing computation under encryption, which is used by many privacy-preserving machine learning solutions, for example, to perform secure classification. Modern deep learning …
External link:
http://arxiv.org/abs/2306.06736
A central objective in computer vision is to design models with appropriate 2-D inductive bias. Desiderata for 2D inductive bias include two-dimensional position awareness, dynamic spatial locality, and translation and permutation invariance. …
External link:
http://arxiv.org/abs/2306.06635
Recently, sequence learning methods have been applied to the problem of off-policy Reinforcement Learning, including the seminal work on Decision Transformers, which employs transformers for this task. Since transformers are parameter-heavy, cannot …
External link:
http://arxiv.org/abs/2306.05167
We present a new layer in which dynamic (i.e., input-dependent) Infinite Impulse Response (IIR) filters of order two are used to process the input sequence prior to applying conventional attention. The input is split into chunks, and the coefficients …
External link:
http://arxiv.org/abs/2305.14952
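The mechanism described in this last abstract can be sketched as follows: an order-two IIR filter whose coefficients are derived per chunk from the input itself. The coefficient generator here (`toy_coeffs`) is a hypothetical stand-in for the paper's learned module, used only to make the sketch runnable.

```python
import numpy as np

def iir2(x, a1, a2, b0):
    # Order-two IIR filter: y[t] = b0*x[t] + a1*y[t-1] + a2*y[t-2].
    y = np.zeros_like(x, dtype=float)
    for t in range(len(x)):
        y[t] = b0 * x[t]
        if t >= 1:
            y[t] += a1 * y[t - 1]
        if t >= 2:
            y[t] += a2 * y[t - 2]
    return y

def dynamic_iir_over_chunks(x, chunk_size, coeff_fn):
    # Split the sequence into chunks and derive each chunk's filter
    # coefficients from the chunk content itself (the "dynamic" part).
    out = []
    for i in range(0, len(x), chunk_size):
        c = x[i:i + chunk_size]
        a1, a2, b0 = coeff_fn(c)
        out.append(iir2(c, a1, a2, b0))
    return np.concatenate(out)

def toy_coeffs(c):
    # Hypothetical coefficient generator: stable coefficients derived
    # from simple chunk statistics (|a1| < 0.5 keeps the filter stable).
    a1 = 0.5 * np.tanh(c.mean())
    return a1, -0.1, 1.0

x = np.sin(np.linspace(0.0, 6.0, 64))
y = dynamic_iir_over_chunks(x, chunk_size=16, coeff_fn=toy_coeffs)
```

In the layer the abstract describes, the filtered sequence would then be fed into a conventional attention block; the filter itself runs in time linear in sequence length.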