Showing 1 - 10 of 122,699 for search: '"Or, Omer"'
We investigate experimentally the undirected open microwave network $\Gamma$ with internal absorption composed of two coupled directed halves, unidirectional networks $\Gamma_{+}$ and $\Gamma_{-}$, corresponding to two possible directions of motion…
External link:
http://arxiv.org/abs/2409.03493
Author:
Gul, Mustafa Omer, Artzi, Yoav
Systems with both language comprehension and generation capabilities can benefit from the tight connection between the two. This work studies coupling comprehension and generation with a focus on continually learning from interaction with users. We pro…
External link:
http://arxiv.org/abs/2408.15992
Mesh-free Lagrangian methods are widely used for simulating fluids, solids, and their complex interactions due to their ability to handle large deformations and topological changes. These physics simulators, however, require substantial computational…
External link:
http://arxiv.org/abs/2408.15753
Author:
Safavi-Naini, Seyed Amir Ahmad, Ali, Shuhaib, Shahab, Omer, Shahhoseini, Zahra, Savage, Thomas, Rafiee, Sara, Samaan, Jamil S, Shabeeb, Reem Al, Ladak, Farah, Yang, Jamie O, Echavarria, Juan, Babar, Sumbal, Shaukat, Aasma, Margolis, Samuel, Tatonetti, Nicholas P, Nadkarni, Girish, Kurdi, Bara El, Soroush, Ali
Background and Aims: This study evaluates the medical reasoning performance of large language models (LLMs) and vision language models (VLMs) in gastroenterology. Methods: We used 300 gastroenterology board exam-style multiple-choice questions, 138 o…
External link:
http://arxiv.org/abs/2409.00084
Author:
Jamba Team, Lenz, Barak, Arazi, Alan, Bergman, Amir, Manevich, Avshalom, Peleg, Barak, Aviram, Ben, Almagor, Chen, Fridman, Clara, Padnos, Dan, Gissin, Daniel, Jannai, Daniel, Muhlgay, Dor, Zimberg, Dor, Gerber, Edden M, Dolev, Elad, Krakovsky, Eran, Safahi, Erez, Schwartz, Erez, Cohen, Gal, Shachaf, Gal, Rozenblum, Haim, Bata, Hofit, Blass, Ido, Magar, Inbal, Dalmedigos, Itay, Osin, Jhonathan, Fadlon, Julie, Rozman, Maria, Danos, Matan, Gokhman, Michael, Zusman, Mor, Gidron, Naama, Ratner, Nir, Gat, Noam, Rozen, Noam, Fried, Oded, Leshno, Ohad, Antverg, Omer, Abend, Omri, Lieber, Opher, Dagan, Or, Cohavi, Orit, Alon, Raz, Belson, Ro'i, Cohen, Roi, Gilad, Rom, Glozman, Roman, Lev, Shahar, Meirom, Shaked, Delbari, Tal, Ness, Tal, Asida, Tomer, Gal, Tom Ben, Braude, Tom, Pumerantz, Uriya, Cohen, Yehoshua, Belinkov, Yonatan, Globerson, Yuval, Levy, Yuval Peleg, Shoham, Yoav
We present Jamba-1.5, new instruction-tuned large language models based on our Jamba architecture. Jamba is a hybrid Transformer-Mamba mixture-of-experts architecture, providing high throughput and low memory usage across context lengths, while retaining…
External link:
http://arxiv.org/abs/2408.12570
Author:
Shum, KaShun, Xu, Minrui, Zhang, Jianshu, Chen, Zixin, Diao, Shizhe, Dong, Hanze, Zhang, Jipeng, Raza, Muhammad Omer
Large language models (LLMs) have become increasingly prevalent in our daily lives, leading to an expectation for LLMs to be trustworthy: both accurate and well-calibrated (the prediction confidence should align with its ground-truth correctness)…
External link:
http://arxiv.org/abs/2408.12168
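The calibration notion in the abstract above (prediction confidence aligning with ground-truth correctness) is commonly quantified with Expected Calibration Error (ECE). A minimal generic sketch follows; it illustrates the standard binned metric, not this paper's specific protocol:

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """Bin predictions by confidence; average the |accuracy - confidence| gap,
    weighted by the fraction of samples in each bin."""
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            gap = abs(correct[mask].mean() - confidences[mask].mean())
            ece += (mask.sum() / len(confidences)) * gap
    return ece

# A well-calibrated toy case: 80% confidence with 80% empirical accuracy
# yields an ECE of (numerically) zero.
print(expected_calibration_error([0.8] * 10, [1] * 8 + [0] * 2))
```

A model that is accurate but overconfident (say, 99% confidence at 80% accuracy) would score a large ECE, which is exactly the failure mode the abstract distinguishes from raw accuracy.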
Author:
Zhou, Chunting, Yu, Lili, Babu, Arun, Tirumala, Kushal, Yasunaga, Michihiro, Shamis, Leonid, Kahn, Jacob, Ma, Xuezhe, Zettlemoyer, Luke, Levy, Omer
We introduce Transfusion, a recipe for training a multi-modal model over discrete and continuous data. Transfusion combines the language modeling loss function (next token prediction) with diffusion to train a single transformer over mixed-modality sequences…
External link:
http://arxiv.org/abs/2408.11039
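The combination described above (next-token prediction plus a diffusion loss in one model) can be sketched as a single joint objective. This toy NumPy version is an illustration only: the shapes and the balancing weight `lam` are hypothetical, not the paper's actual implementation:

```python
import numpy as np

def softmax_xent(logits, targets):
    """Next-token cross-entropy over a [positions, vocab] logit matrix."""
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(targets)), targets].mean()

def combined_loss(text_logits, text_targets, noise_pred, noise, lam=5.0):
    """Toy joint objective: language-modeling cross-entropy on discrete text
    tokens plus a diffusion-style MSE between predicted and true noise on
    continuous patches; lam is a made-up balancing weight."""
    lm = softmax_xent(text_logits, text_targets)          # next-token prediction
    diff = np.mean((noise_pred - noise) ** 2)             # denoising objective
    return lm + lam * diff

# Toy shapes: 4 text positions over a 100-token vocab, 4 patches of dim 16.
rng = np.random.default_rng(0)
logits = rng.standard_normal((4, 100))
targets = rng.integers(0, 100, size=4)
noise_pred = rng.standard_normal((4, 16))
noise = rng.standard_normal((4, 16))
print(combined_loss(logits, targets, noise_pred, noise) > 0)
```

The point of the sketch is only that both modalities backpropagate through one shared set of transformer weights via a single scalar loss.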
We study the steady-state properties of quantum channels with local Kraus operators. We consider a large family that consists of general ergodic 1-local (non-interacting) terms and general 2-local (interacting) terms. Physically, a repeated application…
External link:
http://arxiv.org/abs/2408.08672
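A channel built from Kraus operators, as in the abstract above, acts as rho -> sum_i K_i rho K_i^dagger. A minimal sketch using the standard amplitude-damping channel as a stand-in 1-local example (the value of gamma is illustrative, not from the paper):

```python
import numpy as np

def apply_channel(rho, kraus_ops):
    """Apply a quantum channel: rho -> sum_i K_i rho K_i^dagger."""
    return sum(K @ rho @ K.conj().T for K in kraus_ops)

# Amplitude-damping channel on one qubit; gamma is an illustrative decay rate.
gamma = 0.3
K0 = np.array([[1, 0], [0, np.sqrt(1 - gamma)]], dtype=complex)
K1 = np.array([[0, np.sqrt(gamma)], [0, 0]], dtype=complex)

rho = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)  # |+><+| state
out = apply_channel(rho, [K0, K1])
# Kraus completeness (sum K_i^dagger K_i = I) guarantees trace preservation.
print(np.isclose(np.trace(out).real, 1.0))
```

Repeated application (`out = apply_channel(out, ...)` in a loop) drives the state toward the channel's steady state, which is the object the abstract studies.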
The Transformer architecture has revolutionized deep learning through its Self-Attention mechanism, which effectively captures contextual information. However, the memory footprint of Self-Attention presents significant challenges for long-sequence tasks…
External link:
http://arxiv.org/abs/2408.08454
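The memory footprint mentioned above comes from the full [L, L] attention-score matrix, which grows quadratically in sequence length. A small NumPy sketch of generic scaled dot-product attention (not this paper's method) makes the scaling visible:

```python
import numpy as np

def attention_weights(q, k):
    """Full self-attention weights: materializes an [L, L] matrix,
    so memory grows as O(L^2) in the sequence length L."""
    d = q.shape[-1]
    scores = (q @ k.T) / np.sqrt(d)
    scores -= scores.max(axis=-1, keepdims=True)  # numerically stable softmax
    w = np.exp(scores)
    return w / w.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
for L in (256, 512, 1024):
    q = rng.standard_normal((L, 64))
    k = rng.standard_normal((L, 64))
    w = attention_weights(q, k)
    # Doubling L quadruples the score-matrix memory (L*L float64 values).
    print(L, w.shape, w.nbytes)
```

For L = 1024 the weight matrix alone holds 1024 x 1024 float64 entries (8 MiB), before any batch or head dimension; this quadratic term is what long-sequence methods aim to avoid.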
As connected and autonomous vehicles proliferate, the Controller Area Network (CAN) bus has become the predominant communication standard for in-vehicle networks due to its speed and efficiency. However, the CAN bus lacks basic security measures such as…
External link:
http://arxiv.org/abs/2408.08433