Oxford University Press (OUP), Monthly Notices of the Royal Astronomical Society, 493(4), pp. 5913-5927, 2020
ABSTRACT Creating a database of 21 cm brightness temperature signals from the Epoch of Reionization (EoR) for an array of reionization histories is a complex and computationally expensive task, given the range of astrophysical processes involved and the potentially high-dimensional parameter space to be probed. We utilize a specific type of neural network, a progressively growing generative adversarial network (PGGAN), to produce realistic tomography images of the 21 cm brightness temperature during the EoR, covering a continuous three-dimensional parameter space that models varying X-ray emissivity, Lyman band emissivity, and the ratio between hard and soft X-rays. The GPU-trained network generates new samples at a resolution of ∼3 arcmin within a second on a laptop CPU, and the resulting global 21 cm signal, power spectrum, and pixel distribution function agree well with those of the training data, taken from the 21SSD catalogue (Semelin et al.). Finally, we showcase how a trained PGGAN can be leveraged for the inverse task of inferring parameters from 21 cm tomography samples via Approximate Bayesian Computation.
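The closing point of the abstract, using the trained generator as a fast forward model inside Approximate Bayesian Computation, can be illustrated with a minimal rejection-ABC sketch. This is an assumption-laden illustration rather than the authors' pipeline: the generator call, the spherically averaged power-spectrum summary, the prior sampler, and the tolerance `eps` are hypothetical stand-ins for the quantities described in the abstract.

```python
# Minimal sketch: rejection ABC with a trained conditional generator as the
# forward model. All names (generator, prior_sampler, eps, ...) are hypothetical.
import numpy as np

def power_spectrum_1d(image, n_bins=32):
    """Spherically averaged power spectrum of a 2D brightness-temperature slice."""
    ft = np.fft.fftshift(np.fft.fft2(image))
    power = np.abs(ft) ** 2
    ny, nx = image.shape
    ky, kx = np.meshgrid(np.fft.fftshift(np.fft.fftfreq(ny)),
                         np.fft.fftshift(np.fft.fftfreq(nx)), indexing="ij")
    k = np.hypot(kx, ky)
    bins = np.linspace(0.0, k.max(), n_bins)
    idx = np.digitize(k.ravel(), bins)
    return np.array([power.ravel()[idx == i].mean() if np.any(idx == i) else 0.0
                     for i in range(1, n_bins)])

def abc_rejection(observed, generator, prior_sampler, n_draws=10000, eps=0.1):
    """Keep parameter draws whose generated slices match the observed summary."""
    s_obs = power_spectrum_1d(observed)
    accepted = []
    for _ in range(n_draws):
        theta = prior_sampler()        # e.g. (X-ray emissivity, Lyman band emissivity, hard/soft ratio)
        sample = generator(theta)      # one forward pass through the trained network
        s_sim = power_spectrum_1d(sample)
        distance = np.linalg.norm((s_sim - s_obs) / (np.abs(s_obs) + 1e-12))
        if distance < eps:
            accepted.append(theta)
    return np.array(accepted)          # approximate posterior samples
```

The accepted draws approximate the posterior over the three astrophysical parameters; the choice of summary statistic (power spectrum, pixel distribution function, or global signal), distance metric, and tolerance would all affect the result, and the paper should be consulted for the actual choices made.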