Lossless, Scalable Implicit Likelihood Inference for Cosmological Fields
Author: | Justin Alsing, T. Lucas Makinen, Benjamin D. Wandelt, Tom Charnock |
---|---|
Contributors: | Institut d'Astrophysique de Paris (IAP), Institut national des sciences de l'Univers (INSU - CNRS)-Sorbonne Université (SU)-Centre National de la Recherche Scientifique (CNRS) |
Language: | English |
Year of publication: | 2021 |
Subject: |
Lossless compression; dark matter simulations; random field; Gaussian cosmological simulations; inference; cosmological parameters from LSS; covariance; power spectrum; convolutional neural network; artificial neural network; robustness (computer science); algorithm; FOS: Physical sciences; Astronomy and Astrophysics; Astrophysics - Cosmology and Nongalactic Astrophysics (astro-ph.CO); [PHYS.ASTR]Physics [physics]/Astrophysics [astro-ph] |
Source: | JCAP, 2021, 11 (11), pp.049. ⟨10.1088/1475-7516/2021/11/049⟩ |
DOI: | 10.1088/1475-7516/2021/11/049 |
Description: | We present a comparison of simulation-based inference to full, field-based analytical inference in cosmological data analysis. To do so, we explore parameter inference for two cases where the information content is calculable analytically: Gaussian random fields whose covariance depends on parameters through the power spectrum, and correlated lognormal fields with cosmological power spectra. We compare two inference techniques: i) explicit field-level inference using the known likelihood and ii) implicit likelihood inference with maximally informative summary statistics compressed via Information Maximising Neural Networks (IMNNs). We find that a) summaries obtained from convolutional neural network compression do not lose information and therefore saturate the known field information content, both for the Gaussian covariance and the lognormal cases; b) simulation-based inference using these maximally informative nonlinear summaries nearly losslessly recovers the exact posteriors of field-level inference, bypassing the need to evaluate expensive likelihoods or invert covariance matrices; and c) even for this simple example, implicit, simulation-based inference incurs a much smaller computational cost than inference with an explicit likelihood. This work uses a new IMNN implementation in $\texttt{Jax}$ that can take advantage of a fully differentiable simulation and inference pipeline. We also demonstrate that a single retraining of the IMNN summaries effectively achieves the theoretically maximal information, enhancing robustness to the choice of fiducial model at which the IMNN is trained. To be submitted to JCAP. We provide code and a tutorial for the analysis and relevant software at https://github.com/tlmakinen/FieldIMNNs |
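The description refers to Gaussian random fields whose covariance depends on parameters through the power spectrum, simulated in a fully differentiable Jax pipeline. A minimal sketch of that idea follows; it is not the paper's actual code, and the grid size, the power-law spectrum P(k) = A k^(-n), and all function names are illustrative assumptions:

```python
# Illustrative sketch only: a 1D Gaussian random field whose covariance
# depends on parameters (A, n) through a power-law spectrum P(k) = A * k^(-n).
# Grid size, spectrum shape, and names are assumptions, not the paper's pipeline.
import jax
import jax.numpy as jnp

N = 64                                  # number of grid points (illustrative)
k = jnp.fft.rfftfreq(N) * 2.0 * jnp.pi  # Fourier frequencies of the grid
k = k.at[0].set(k[1])                   # regularise the k = 0 mode

def power_spectrum(A, n):
    return A * k ** (-n)

def simulate(key, A, n):
    """Draw a real-space Gaussian field by colouring white noise in Fourier space."""
    white = jax.random.normal(key, (N,))
    wk = jnp.fft.rfft(white)
    return jnp.fft.irfft(wk * jnp.sqrt(power_spectrum(A, n)), N)

def mean_power(A):
    """Scalar summary of one realisation; differentiable w.r.t. the amplitude A."""
    field = simulate(jax.random.PRNGKey(0), A, 1.0)
    return jnp.mean(field ** 2)

# Because the simulator is written entirely in Jax, gradients of summaries with
# respect to parameters come for free, which is what a fully differentiable
# simulation and inference pipeline exploits.
grad_A = jax.grad(mean_power)(1.0)
```

Since the realisation is linear in the square root of the amplitude, the mean power is proportional to A, so its gradient with respect to A is positive; the same automatic differentiation extends to the compression network and inference stages.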
Database: | OpenAIRE |
External link: |