Showing 1 - 10 of 10,675 results for search: '"A P, Mullins"'
It is now a common business practice to buy access to large language model (LLM) inference rather than self-host, because of significant upfront hardware infrastructure and energy costs. However, as a buyer, there is no mechanism to verify the authenticity …
External link:
http://arxiv.org/abs/2411.05197
Author:
Khachaturov, David, Mullins, Robert
Quantifying robustness in a single measure for the purposes of model selection, development of adversarial training methods, and anticipating trends has so far been elusive. The simplest metric to consider is the number of trainable parameters in a model …
External link:
http://arxiv.org/abs/2410.18556
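The entry above names the number of trainable parameters as the simplest candidate for a single robustness measure. A minimal sketch of computing that count for a PyTorch model; the torchvision ResNet-18 is only an illustrative stand-in, not a model from the paper.

    import torchvision

    # Any torch.nn.Module works the same way; ResNet-18 is just a stand-in.
    model = torchvision.models.resnet18(weights=None)

    # The "simplest metric": total count of trainable parameters.
    n_trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
    print(f"trainable parameters: {n_trainable:,}")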
Author:
Abel, C., Ayres, N. J., Ban, G., Bison, G., Bodek, K., Bondar, V., Bouillaud, T., Bowles, D. C., Caratsch, G. L., Chanel, E., Chen, W., Chiu, P. -J., Crawford, C., Dechenaux, B., Doorenbos, C. B., Emmenegger, S., Ferraris-Bouchez, L., Fertl, M., Flaux, P., Fratangelo, A., Goupillière, D., Griffith, W. C., Höhl, D., Kasprzak, M., Kirch, K., Kletzl, V., Komposch, S. V., Koss, P. A., Krempel, J., Lauss, B., Lefort, T., Lejuez, A., Li, R., Meier, M., Menu, J., Michielsen, K., Mullan, P., Mullins, A., Naviliat-Cuncic, O., Pais, D., Piegsa, F. M., Pignol, G., Quemener, G., Rawlik, M., Rebreyend, D., Rienaecker, I., Ries, D., Roccia, S., Rozpedzik, D., Schnabel, A., Schmidt-Wellenburg, P., Segarra, E. P., Severijns, N., Smith, C. A., Svirina, K., Tavakoli, R., Thorne, J., Touati, S., Vankeirsbilck, J., Virot, R., Voigt, J., Wursten, E., Yazdandoost, N., Zejma, J., Ziehl, N., Zsigmond, G.
We present a coil system designed to generate a highly uniform magnetic field for the n2EDM experiment at the Paul Scherrer Institute. It consists of a main $B_0$ coil and a set of auxiliary coils mounted on a cubic structure with a side length of 27…
External link:
http://arxiv.org/abs/2410.07914
Author:
Augeri, Christopher James, Mullins, Barry E., Baird III, Leemon C., Bulutoglu, Dursun A., Baldwin, Rusty O.
Published in:
Proceedings of the 2007 workshop on Experimental Computer Science (ExpCS) at ACM FCRC 2007
XML simplifies data exchange among heterogeneous computers, but it is notoriously verbose and has spawned the development of many XML-specific compressors and binary formats. We present an XML test corpus and a combined efficiency metric integrating …
External link:
http://arxiv.org/abs/2410.07603
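The entry above refers to a combined efficiency metric for XML compressors; the snippet does not say how that metric is defined, so the sketch below only measures the two raw quantities such a metric would plausibly combine, compression ratio and compression time, using Python's built-in zlib on a toy document.

    import time
    import zlib

    # Toy XML payload; the paper's corpus is much larger and more varied.
    xml = ("<orders>" + "".join(
        f"<order id='{i}'><item>widget</item><qty>3</qty></order>" for i in range(1000)
    ) + "</orders>").encode("utf-8")

    start = time.perf_counter()
    compressed = zlib.compress(xml, level=9)
    elapsed = time.perf_counter() - start

    ratio = len(xml) / len(compressed)  # higher is better
    print(f"original: {len(xml)} B, compressed: {len(compressed)} B")
    print(f"ratio: {ratio:.1f}x in {elapsed * 1000:.2f} ms")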
Author:
Backman, Spencer, Charbonneau, Cole, Loehr, Nicholas A., Mullins, Patrick, O'Connor, Mazie, Warrington, Gregory S.
For $0\leq k\leq n-1$, we introduce a family of $k$-skeletal paths which are counted by the $n$-th Catalan number for each $k$, and specialize to Dyck paths when $k=n-1$. We similarly introduce $k$-skeletal parking functions which are equinumerous with …
External link:
http://arxiv.org/abs/2408.06923
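The entry above leans on the classical fact that Dyck paths with 2n steps are counted by the n-th Catalan number (the k = n-1 case of the construction). The k-skeletal objects themselves are not spelled out in the snippet, so this sketch only verifies the classical count by brute-force enumeration.

    from itertools import product
    from math import comb

    def is_dyck(steps):
        """Up/down steps (+1/-1) that never dip below zero and end at zero."""
        height = 0
        for s in steps:
            height += s
            if height < 0:
                return False
        return height == 0

    def catalan(n):
        return comb(2 * n, n) // (n + 1)

    for n in range(1, 7):
        count = sum(is_dyck(p) for p in product((1, -1), repeat=2 * n))
        assert count == catalan(n)
        print(f"n = {n}: {count} Dyck paths = Catalan({n})")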
Author:
Funke, Lars, Ilchen, Markus, Dingel, Kristina, Mazza, Tommaso, Mullins, Terence, Otto, Thorsten, Rivas, Daniel, Savio, Sara, Serkez, Svitozar, Walter, Peter, Wieland, Niclas, Wülfing, Lasse, Bari, Sadia, Boll, Rebecca, Braune, Markus, Calegari, Francesca, De Fanis, Alberto, Decking, Winfried, Duensing, Andreas, Düsterer, Stefan, Ehresmann, Arno, Erk, Benjamin, de Lima, Danilo Enoque Ferreira, Galler, Andreas, Geloni, Gianluca, Grünert, Jan, Guetg, Marc, Grychtol, Patrik, Hans, Andreas, Held, Arne, Hindriksson, Ruda, Inhester, Ludger, Jahnke, Till, Laksman, Joakim, Larsson, Mats, Liu, Jia, Marangos, Jon P., Marder, Lutz, Meier, David, Meyer, Michael, Mirian, Najmeh, Ott, Christian, Passow, Christopher, Pfeifer, Thomas, Rupprecht, Patrick, Schletter, Albert, Schmidt, Philipp, Scholz, Frank, Schott, Simon, Schneidmiller, Evgeny, Sick, Bernhard, Son, Sang-Kil, Tiedtke, Kai, Usenko, Sergey, Wanie, Vincent, Wurzer, Markus, Yurkov, Mikhail, Zhaunerchyk, Vitali, Helml, Wolfram
Attosecond X-ray pulses are the key to studying electron dynamics at their natural time scale involving specific electronic states. They promise to build a conceptual bridge between physical and chemical photo-reaction processes. Free-electron …
External link:
http://arxiv.org/abs/2408.03858
Author:
Gemma Team, Riviere, Morgane, Pathak, Shreya, Sessa, Pier Giuseppe, Hardin, Cassidy, Bhupatiraju, Surya, Hussenot, Léonard, Mesnard, Thomas, Shahriari, Bobak, Ramé, Alexandre, Ferret, Johan, Liu, Peter, Tafti, Pouya, Friesen, Abe, Casbon, Michelle, Ramos, Sabela, Kumar, Ravin, Lan, Charline Le, Jerome, Sammy, Tsitsulin, Anton, Vieillard, Nino, Stanczyk, Piotr, Girgin, Sertan, Momchev, Nikola, Hoffman, Matt, Thakoor, Shantanu, Grill, Jean-Bastien, Neyshabur, Behnam, Bachem, Olivier, Walton, Alanna, Severyn, Aliaksei, Parrish, Alicia, Ahmad, Aliya, Hutchison, Allen, Abdagic, Alvin, Carl, Amanda, Shen, Amy, Brock, Andy, Coenen, Andy, Laforge, Anthony, Paterson, Antonia, Bastian, Ben, Piot, Bilal, Wu, Bo, Royal, Brandon, Chen, Charlie, Kumar, Chintu, Perry, Chris, Welty, Chris, Choquette-Choo, Christopher A., Sinopalnikov, Danila, Weinberger, David, Vijaykumar, Dimple, Rogozińska, Dominika, Herbison, Dustin, Bandy, Elisa, Wang, Emma, Noland, Eric, Moreira, Erica, Senter, Evan, Eltyshev, Evgenii, Visin, Francesco, Rasskin, Gabriel, Wei, Gary, Cameron, Glenn, Martins, Gus, Hashemi, Hadi, Klimczak-Plucińska, Hanna, Batra, Harleen, Dhand, Harsh, Nardini, Ivan, Mein, Jacinda, Zhou, Jack, Svensson, James, Stanway, Jeff, Chan, Jetha, Zhou, Jin Peng, Carrasqueira, Joana, Iljazi, Joana, Becker, Jocelyn, Fernandez, Joe, van Amersfoort, Joost, Gordon, Josh, Lipschultz, Josh, Newlan, Josh, Ji, Ju-yeong, Mohamed, Kareem, Badola, Kartikeya, Black, Kat, Millican, Katie, McDonell, Keelin, Nguyen, Kelvin, Sodhia, Kiranbir, Greene, Kish, Sjoesund, Lars Lowe, Usui, Lauren, Sifre, Laurent, Heuermann, Lena, Lago, Leticia, McNealus, Lilly, Soares, Livio Baldini, Kilpatrick, Logan, Dixon, Lucas, Martins, Luciano, Reid, Machel, Singh, Manvinder, Iverson, Mark, Görner, Martin, Velloso, Mat, Wirth, Mateo, Davidow, Matt, Miller, Matt, Rahtz, Matthew, Watson, Matthew, Risdal, Meg, Kazemi, Mehran, Moynihan, Michael, Zhang, Ming, Kahng, Minsuk, Park, Minwoo, Rahman, Mofi, Khatwani, Mohit, Dao, Natalie, Bardoliwalla, Nenshad, Devanathan, Nesh, Dumai, Neta, Chauhan, Nilay, Wahltinez, Oscar, Botarda, Pankil, Barnes, Parker, Barham, Paul, Michel, Paul, Jin, Pengchong, Georgiev, Petko, Culliton, Phil, Kuppala, Pradeep, Comanescu, Ramona, Merhej, Ramona, Jana, Reena, Rokni, Reza Ardeshir, Agarwal, Rishabh, Mullins, Ryan, Saadat, Samaneh, Carthy, Sara Mc, Cogan, Sarah, Perrin, Sarah, Arnold, Sébastien M. R., Krause, Sebastian, Dai, Shengyang, Garg, Shruti, Sheth, Shruti, Ronstrom, Sue, Chan, Susan, Jordan, Timothy, Yu, Ting, Eccles, Tom, Hennigan, Tom, Kocisky, Tomas, Doshi, Tulsee, Jain, Vihan, Yadav, Vikas, Meshram, Vilobh, Dharmadhikari, Vishal, Barkley, Warren, Wei, Wei, Ye, Wenming, Han, Woohyun, Kwon, Woosuk, Xu, Xiang, Shen, Zhe, Gong, Zhitao, Wei, Zichuan, Cotruta, Victor, Kirk, Phoebe, Rao, Anand, Giang, Minh, Peran, Ludovic, Warkentin, Tris, Collins, Eli, Barral, Joelle, Ghahramani, Zoubin, Hadsell, Raia, Sculley, D., Banks, Jeanine, Dragan, Anca, Petrov, Slav, Vinyals, Oriol, Dean, Jeff, Hassabis, Demis, Kavukcuoglu, Koray, Farabet, Clement, Buchatskaya, Elena, Borgeaud, Sebastian, Fiedel, Noah, Joulin, Armand, Kenealy, Kathleen, Dadashi, Robert, Andreev, Alek
In this work, we introduce Gemma 2, a new addition to the Gemma family of lightweight, state-of-the-art open models, ranging in scale from 2 billion to 27 billion parameters. In this new version, we apply several known technical modifications to the …
External link:
http://arxiv.org/abs/2408.00118
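Since the entry above is the Gemma 2 model release, a minimal sketch of loading one of the open checkpoints with the Hugging Face transformers library; the model id and generation settings are assumptions based on the public release, and downloading the weights requires accepting the Gemma terms on the Hub.

    # Assumes `pip install transformers accelerate torch` and Hub access to the
    # google/gemma-2-2b-it checkpoint (an assumption; 9B and 27B variants also exist).
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "google/gemma-2-2b-it"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype=torch.bfloat16, device_map="auto"
    )

    inputs = tokenizer("Explain grouped-query attention in one sentence.",
                       return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(out[0], skip_special_tokens=True))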
Author:
Zeng, Wenjun, Liu, Yuchi, Mullins, Ryan, Peran, Ludovic, Fernandez, Joe, Harkous, Hamza, Narasimhan, Karthik, Proud, Drew, Kumar, Piyush, Radharapu, Bhaktipriya, Sturman, Olivia, Wahltinez, Oscar
We present ShieldGemma, a comprehensive suite of LLM-based safety content moderation models built upon Gemma2. These models provide robust, state-of-the-art predictions of safety risks across key harm types (sexually explicit, dangerous content, harassment, …
External link:
http://arxiv.org/abs/2407.21772
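The entry above describes ShieldGemma as an LLM-based safety classifier. A common pattern for such classifiers is to phrase the policy check as a yes/no question and compare the model's probabilities for "Yes" and "No"; the sketch below illustrates that pattern, but the model id, prompt wording, and token lookup are assumptions for illustration rather than the official ShieldGemma template (see its model card for that).

    # Illustrative yes/no scoring pattern; model id, prompt, and token lookup are
    # assumptions, not the official ShieldGemma usage.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "google/shieldgemma-2b"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

    prompt = ("Does the following user message violate a policy against "
              "harassment? Answer Yes or No.\n"
              "Message: \"You are brilliant!\"\nAnswer: ")
    inputs = tokenizer(prompt, return_tensors="pt")
    with torch.no_grad():
        next_token_logits = model(**inputs).logits[0, -1]

    vocab = tokenizer.get_vocab()  # assumes "Yes"/"No" exist as single tokens
    yes_no = next_token_logits[[vocab["Yes"], vocab["No"]]]
    p_violation = torch.softmax(yes_no, dim=0)[0].item()
    print(f"estimated probability of a violation: {p_violation:.3f}")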
Author:
Zhang, Zixi, Zhang, Cheng, Gao, Xitong, Mullins, Robert D., Constantinides, George A., Zhao, Yiren
Low-Rank Adaptation (LoRA) has been the de facto parameter-efficient fine-tuning technique for large language models. We present HeteroLoRA, a lightweight search algorithm that leverages zero-cost proxies to allocate the limited LoRA trainable parameters …
External link:
http://arxiv.org/abs/2406.14956
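The entry above builds on LoRA; a minimal sketch of the underlying idea, a frozen linear layer plus a trainable low-rank update, kept generic rather than reproducing the HeteroLoRA search itself, whose zero-cost proxies are not described in the snippet.

    import torch
    import torch.nn as nn

    class LoRALinear(nn.Module):
        """Frozen base linear layer plus a trainable low-rank update (B @ A)."""
        def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
            super().__init__()
            self.base = base
            for p in self.base.parameters():
                p.requires_grad = False          # pretrained weights stay frozen
            self.A = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
            self.B = nn.Parameter(torch.zeros(base.out_features, rank))
            self.scale = alpha / rank

        def forward(self, x):
            return self.base(x) + self.scale * (x @ self.A.T @ self.B.T)

    layer = LoRALinear(nn.Linear(512, 512), rank=8)
    trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
    print(f"trainable parameters in this layer: {trainable}")  # 2 * 8 * 512 = 8192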
Author:
Chen, Yuang, Zhang, Cheng, Gao, Xitong, Mullins, Robert D., Constantinides, George A., Zhao, Yiren
Grouped-query attention (GQA) has been widely adopted in LLMs to mitigate the complexity of multi-head attention (MHA). To transform an MHA into a GQA, neighbouring queries in the MHA are evenly split into groups, where each group shares the value and key layers …
External link:
http://arxiv.org/abs/2406.14963
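The entry above describes turning multi-head attention into grouped-query attention by evenly splitting neighbouring query heads into groups that share key and value projections. One simple way to construct the shared projections is to mean-pool the per-head K/V weights within each group; the sketch below does that on random weights (mean-pooling is an assumption here, the snippet does not say how the paper merges heads).

    import torch

    n_heads, head_dim, d_model, n_groups = 8, 64, 512, 2
    group_size = n_heads // n_groups

    # Per-head key/value projection weights of the original MHA.
    k_proj = torch.randn(n_heads, head_dim, d_model)
    v_proj = torch.randn(n_heads, head_dim, d_model)

    # GQA: neighbouring heads form groups, each group sharing one K/V projection,
    # built here by mean-pooling the group's original head weights.
    k_gqa = k_proj.view(n_groups, group_size, head_dim, d_model).mean(dim=1)
    v_gqa = v_proj.view(n_groups, group_size, head_dim, d_model).mean(dim=1)

    print(k_gqa.shape)  # torch.Size([2, 64, 512]) -- one shared K per group
    # At attention time, query head h reads the shared K/V of group h // group_size.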