Showing 1 - 10 of 1,030 for search: '"Martin, Louis"'
Author:
Wirth-Singh, Anna, Fröch, Johannes E., Yang, Fan, Martin, Louis, Zhang, Hualiang, Tanguy, Quentin T., Zhou, Zhihao, Huang, Luocheng, John, Demis D., Stamenic, Biljana, Hu, Juejun, Gu, Tian, Majumdar, Arka
Wide field-of-view and lightweight optics are critical for advanced eyewear, with applications in augmented/virtual reality and night vision. Conventional refractive lenses are often stacked to correct aberrations at wide field of view, leading to…
External link:
http://arxiv.org/abs/2406.14725
Author:
Tuan, Yi-Lin, Chen, Xilun, Smith, Eric Michael, Martin, Louis, Batra, Soumya, Celikyilmaz, Asli, Wang, William Yang, Bikel, Daniel M.
As large language models (LLMs) become easily accessible nowadays, the trade-off between safety and helpfulness can significantly impact user experience. A model that prioritizes safety will cause users to feel less engaged and assisted, while prioritizing…
External link:
http://arxiv.org/abs/2404.01295
Author:
Xiong, Wenhan, Liu, Jingyu, Molybog, Igor, Zhang, Hejia, Bhargava, Prajjwal, Hou, Rui, Martin, Louis, Rungta, Rashi, Sankararaman, Karthik Abinav, Oguz, Barlas, Khabsa, Madian, Fang, Han, Mehdad, Yashar, Narang, Sharan, Malik, Kshitiz, Fan, Angela, Bhosale, Shruti, Edunov, Sergey, Lewis, Mike, Wang, Sinong, Ma, Hao
We present a series of long-context LLMs that support effective context windows of up to 32,768 tokens. Our model series are built through continual pretraining from Llama 2 with longer training sequences and on a dataset where long texts are upsampled…
External link:
http://arxiv.org/abs/2309.16039
Author:
Rozière, Baptiste, Gehring, Jonas, Gloeckle, Fabian, Sootla, Sten, Gat, Itai, Tan, Xiaoqing Ellen, Adi, Yossi, Liu, Jingyu, Sauvestre, Romain, Remez, Tal, Rapin, Jérémy, Kozhevnikov, Artyom, Evtimov, Ivan, Bitton, Joanna, Bhatt, Manish, Ferrer, Cristian Canton, Grattafiori, Aaron, Xiong, Wenhan, Défossez, Alexandre, Copet, Jade, Azhar, Faisal, Touvron, Hugo, Martin, Louis, Usunier, Nicolas, Scialom, Thomas, Synnaeve, Gabriel
We release Code Llama, a family of large language models for code based on Llama 2 providing state-of-the-art performance among open models, infilling capabilities, support for large input contexts, and zero-shot instruction following ability for…
External link:
http://arxiv.org/abs/2308.12950
Author:
Touvron, Hugo, Martin, Louis, Stone, Kevin, Albert, Peter, Almahairi, Amjad, Babaei, Yasmine, Bashlykov, Nikolay, Batra, Soumya, Bhargava, Prajjwal, Bhosale, Shruti, Bikel, Dan, Blecher, Lukas, Ferrer, Cristian Canton, Chen, Moya, Cucurull, Guillem, Esiobu, David, Fernandes, Jude, Fu, Jeremy, Fu, Wenyin, Fuller, Brian, Gao, Cynthia, Goswami, Vedanuj, Goyal, Naman, Hartshorn, Anthony, Hosseini, Saghar, Hou, Rui, Inan, Hakan, Kardas, Marcin, Kerkez, Viktor, Khabsa, Madian, Kloumann, Isabel, Korenev, Artem, Koura, Punit Singh, Lachaux, Marie-Anne, Lavril, Thibaut, Lee, Jenya, Liskovich, Diana, Lu, Yinghai, Mao, Yuning, Martinet, Xavier, Mihaylov, Todor, Mishra, Pushkar, Molybog, Igor, Nie, Yixin, Poulton, Andrew, Reizenstein, Jeremy, Rungta, Rashi, Saladi, Kalyan, Schelten, Alan, Silva, Ruan, Smith, Eric Michael, Subramanian, Ranjan, Tan, Xiaoqing Ellen, Tang, Binh, Taylor, Ross, Williams, Adina, Kuan, Jian Xiang, Xu, Puxin, Yan, Zheng, Zarov, Iliyan, Zhang, Yuchen, Fan, Angela, Kambadur, Melanie, Narang, Sharan, Rodriguez, Aurelien, Stojnic, Robert, Edunov, Sergey, Scialom, Thomas
In this work, we develop and release Llama 2, a collection of pretrained and fine-tuned large language models (LLMs) ranging in scale from 7 billion to 70 billion parameters. Our fine-tuned LLMs, called Llama 2-Chat, are optimized for dialogue use cases…
External link:
http://arxiv.org/abs/2307.09288
Author:
Popescu, Cosmin-Constantin, Dao, Khoi Phuong, Ranno, Luigi, Mills, Brian, Martin, Louis, Zhang, Yifei, Neltner, Brian, Bono, David, Gu, Tian, Hu, Juejun, Aryana, Kiumars, Humphreys, William M., Kim, Hyun Jung, Vitale, Steven, Miller, Paul, Roberts, Christopher, Geiger, Sarah, Callahan, Dennis, Moebius, Michael, Kang, Myungkoo, Richardson, Kathleen, Ocampo, Carlos A. Ríos
Owing to their unique tunable optical properties, chalcogenide phase change materials are increasingly being investigated for optics and photonics applications. However, in situ characterization of their phase transition characteristics is a capability…
External link:
http://arxiv.org/abs/2307.06216
Author:
Plekhanov, Mikhail, Kassner, Nora, Popat, Kashyap, Martin, Louis, Merello, Simone, Kozlovskii, Borislav, Dreyer, Frédéric A., Cancedda, Nicola
Entity Linking is one of the most common Natural Language Processing tasks in practical applications, but so far efficient end-to-end solutions with multilingual coverage have been lacking, leading to complex model stacks. To fill this gap, we release…
External link:
http://arxiv.org/abs/2306.08896
Author:
Atzeni, Mattia, Plekhanov, Mikhail, Dreyer, Frédéric A., Kassner, Nora, Merello, Simone, Martin, Louis, Cancedda, Nicola
Entity linking methods based on dense retrieval are an efficient and widely used solution in large-scale applications, but they fall short of the performance of generative models, as they are sensitive to the structure of the embedding space. In order…
External link:
http://arxiv.org/abs/2305.12027
Detection and disambiguation of all entities in text is a crucial task for a wide range of applications. The typical formulation of the problem involves two stages: detect mention boundaries and link all mentions to a knowledge base. For a long time…
External link:
http://arxiv.org/abs/2209.06148