Showing 1 - 10
of 16 618
for search: '"A. Barret"'
Context. Inferring spectral parameters from X-ray data is one of the cornerstones of high-energy astrophysics, and is achieved using software stacks that have been developed over the last twenty years and more. However, as models get more complex and …
External link:
http://arxiv.org/abs/2409.05757
Field-effect transistors with graphene channels (GFETs) are an interesting alternative for the detection of analytes in biological fluids, since the electrical behavior of the channel changes when exposed to a sample (among other detection strategies …
External link:
http://arxiv.org/abs/2407.09656
Author:
Xu, Yerong, Pinto, Ciro, Rogantini, Daniele, Barret, Didier, Bianchi, Stefano, Guainazzi, Matteo, Ebrero, Jacobo, Alston, William, Kara, Erin, Cusumano, Giancarlo
Published in:
A&A 687, A179 (2024)
The extreme velocities and high ionization states of ultra-fast outflows (UFOs) make them a promising candidate for AGN feedback on the evolution of the host galaxy. However, their exact underlying driving mechanism is not yet fully understood. Given …
External link:
http://arxiv.org/abs/2405.07494
Author:
Barret, Didier, Albouys, Vincent, Knödlseder, Jürgen, Loizillon, Xavier, D'Andrea, Matteo, Ardellier, Florence, Bandler, Simon, Dieleman, Pieter, Duband, Lionel, Dubbeldam, Luc, Macculi, Claudio, Medinaceli, Eduardo, Pajot, Francois, Prêle, Damien, Ravera, Laurent, Thibert, Tanguy, Trallero, Isabel Vera, Webb, Natalie
The X-ray Integral Field Unit (X-IFU) is the high-resolution X-ray spectrometer to fly on board the Athena Space Observatory of the European Space Agency (ESA). It is being developed by an international Consortium led by France, involving twelve ESA …
External link:
http://arxiv.org/abs/2404.15122
This study introduces novel concepts in the analysis of limit order books (LOBs), with a focus on unveiling strategic insights into spread prediction and understanding the global mid-price (GMP) phenomenon. We define and analyze the total market order …
External link:
http://arxiv.org/abs/2404.11722
Author:
Barret, Didier, Dupourqué, Simon
Neural networks are being extensively used for modelling data, especially in the case where no likelihood can be formulated. Although in the case of X-ray spectral fitting the likelihood is known, we aim to investigate the neural networks' ability to …
External link:
http://arxiv.org/abs/2401.06061
Author:
Leboulleux, Lucie, Cantalloube, Faustine, Foujols, Marie-Alice, Giard, Martin, Guilet, Jérôme, Knödlseder, Jürgen, Santerne, Alexandre, Todorov, Lilia, Barret, Didier, Berne, Olivier, Crida, Aurélien, Hennebelle, Patrick, Kral, Quentin, Lagadec, Eric, Malbet, Fabien, Milli, Julien, N'Diaye, Mamadou, Roques, Françoise
Published in:
SF2A proceedings 2023
To keep current global warming below 1.5°C compared with the pre-industrial era, measures must be taken as quickly as possible in all spheres of society. Astronomy must also make its contribution. In this proceeding, and during the workshop to w…
External link:
http://arxiv.org/abs/2311.13625
Author:
Neveu, Jérémy, Brémaud, Vincent, Antilogus, Pierre, Barret, Florent, Bongard, Sébastien, Copin, Yannick, Dagoret-Campagne, Sylvie, Juramy, Claire, Le-Guillou, Laurent, Moniez, Marc, Sepulveda, Eduardo, Collaboration, The LSST Dark Energy Science
Published in:
A&A, 684, A21 (2024)
In the next decade, many optical surveys will aim to tackle the question of dark energy's nature, measuring its equation-of-state parameter at the permil level. This requires trusting the photometric calibration of the survey with a precision never reached …
External link:
http://arxiv.org/abs/2307.04898
Author:
Shen, Sheng, Hou, Le, Zhou, Yanqi, Du, Nan, Longpre, Shayne, Wei, Jason, Chung, Hyung Won, Zoph, Barret, Fedus, William, Chen, Xinyun, Vu, Tu, Wu, Yuexin, Chen, Wuyang, Webson, Albert, Li, Yunxuan, Zhao, Vincent, Yu, Hongkun, Keutzer, Kurt, Darrell, Trevor, Zhou, Denny
Sparse Mixture-of-Experts (MoE) is a neural architecture design that can be utilized to add learnable parameters to Large Language Models (LLMs) without increasing inference cost. Instruction tuning is a technique for training LLMs to follow instructions …
External link:
http://arxiv.org/abs/2305.14705
Author:
Longpre, Shayne, Yauney, Gregory, Reif, Emily, Lee, Katherine, Roberts, Adam, Zoph, Barret, Zhou, Denny, Wei, Jason, Robinson, Kevin, Mimno, David, Ippolito, Daphne
Pretraining is the preliminary and fundamental step in developing capable language models (LMs). Despite this, pretraining data design is critically under-documented and often guided by empirically unsupported intuitions. To address this, we pretrain …
External link:
http://arxiv.org/abs/2305.13169