Published in

Oxford University Press (OUP), Monthly Notices of the Royal Astronomical Society, 493(4), pp. 5913–5927, 2020

DOI: 10.1093/mnras/staa523

A unified framework for 21 cm tomography sample generation and parameter inference with progressively growing GANs

Journal article published in 2020 by Florian List and Geraint F. Lewis
This paper was not found in any repository, but could be made available legally by the author.

Full text: Unavailable

Preprint: archiving allowed
Postprint: archiving allowed
Published version: archiving allowed
Data provided by SHERPA/RoMEO

Abstract

Creating a database of 21 cm brightness temperature signals from the Epoch of Reionization (EoR) for an array of reionization histories is a complex and computationally expensive task, given the range of astrophysical processes involved and the possibly high-dimensional parameter space that is to be probed. We utilize a specific type of neural network, a progressively growing generative adversarial network (PGGAN), to produce realistic tomography images of the 21 cm brightness temperature during the EoR, covering a continuous three-dimensional parameter space that models varying X-ray emissivity, Lyman-band emissivity, and the ratio between hard and soft X-rays. The GPU-trained network generates new samples at a resolution of ∼3 arcmin within a second (on a laptop CPU), and the resulting global 21 cm signal, power spectrum, and pixel distribution function agree well with those of the training data, taken from the 21SSD catalogue (Semelin et al.). Finally, we showcase how a trained PGGAN can be leveraged for the converse task of inferring parameters from 21 cm tomography samples via Approximate Bayesian Computation.
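
The abstract's closing point, inferring astrophysical parameters from 21 cm tomography samples with Approximate Bayesian Computation (ABC), can be illustrated with a short sketch. Everything below is an assumption for illustration rather than the authors' pipeline: the `generator` interface, the radially averaged power spectrum as summary statistic, the uniform prior box over the three parameters, and the simple rejection threshold are hypothetical stand-ins.

```python
import numpy as np

# Minimal sketch (not the paper's actual code). `generator` is a hypothetical
# trained PGGAN generator mapping a latent vector and a 3-D parameter vector
# (X-ray emissivity, Lyman-band emissivity, hard-to-soft X-ray ratio) to a
# 2-D 21 cm brightness-temperature slice.

def generate_slice(generator, params, latent_dim=512, rng=None):
    """Draw one tomography slice for the given astrophysical parameters."""
    rng = rng or np.random.default_rng()
    z = rng.standard_normal(latent_dim)
    return generator(z, np.asarray(params))

def power_spectrum(image, n_bins=20):
    """Radially averaged 2-D power spectrum, used here as the ABC summary statistic."""
    fourier = np.fft.fftshift(np.fft.fft2(image - image.mean()))
    power = np.abs(fourier).ravel() ** 2
    ny, nx = image.shape
    y, x = np.indices((ny, nx))
    r = np.hypot(x - nx / 2, y - ny / 2).ravel()
    bins = np.linspace(0.0, r.max() + 1e-9, n_bins + 1)
    which = np.digitize(r, bins)
    return np.array([power[which == i].mean() if np.any(which == i) else 0.0
                     for i in range(1, n_bins + 1)])

def abc_rejection(generator, observed, prior_bounds, n_draws=10_000, quantile=0.01, rng=None):
    """Rejection ABC: sample parameters from a uniform prior box, simulate slices
    with the GAN, and keep the draws whose summaries lie closest to the observed one."""
    rng = rng or np.random.default_rng()
    target = np.log10(power_spectrum(observed) + 1e-12)
    lo, hi = np.asarray(prior_bounds, dtype=float).T   # per-parameter (lower, upper) bounds
    draws = rng.uniform(lo, hi, size=(n_draws, lo.size))
    distances = np.empty(n_draws)
    for i, theta in enumerate(draws):
        sample = generate_slice(generator, theta, rng=rng)
        distances[i] = np.linalg.norm(np.log10(power_spectrum(sample) + 1e-12) - target)
    keep = distances <= np.quantile(distances, quantile)
    return draws[keep]   # approximate posterior samples over the 3-D parameter space
```

The design mirrors the idea stated in the abstract: because the trained generator produces samples in about a second on a CPU, it can serve as a cheap forward model inside a likelihood-free inference loop, where a full radiative transfer simulation would be far too slow.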
