First author: J. Adamek
The measurement of the absolute neutrino mass scale from cosmological large-scale clustering data is one of the key science goals of the Euclid mission. Such a measurement relies on precise modelling of the impact of neutrinos on structure formation, which can be studied with $N$-body simulations. Here we present the results from a major code comparison effort to establish the maturity and reliability of numerical methods for treating massive neutrinos.
First author: Isaac Tutusaus
Euclid will observe 15 000 deg$^2$ of the darkest sky, in regions free of contamination by light from our Galaxy and our Solar System. Three “Euclid Deep Fields” covering around 40 deg$^2$ in total will extend the scientific scope of the mission to the high-redshift Universe. The complete survey will comprise hundreds of thousands of images and several tens of petabytes of data.
First author: Chethan Krishnan
We show that the Friedmann-Lemaître-Robertson-Walker (FLRW) framework has an instability towards the growth of fluid flow anisotropies, even if the Universe is accelerating. This flow (tilt) instability in the matter sector is invisible to Cosmic No-Hair Theorem-like arguments, which typically only flag shear anisotropies in the metric. We illustrate our claims in the setting of ``dipole cosmology'', the maximally Copernican generalization of FLRW that can accommodate a flow.
First author: Atsuhisa Ota
The nonlinear Lagrangian displacement field and the initial linear density field are highly correlated. Reconstructing the nonlinear displacement field could therefore better extract primordial cosmological information from the late-time density field. Continuing from Ref. [1], we investigate to what extent the iterative displacement reconstruction of Ref. [2] can recover the true displacement field, with particular emphasis on mitigating numerical discreteness effects and improving the perturbation-theory model for the post-reconstruction field.
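A standard way to quantify how well a reconstructed field recovers the true one is the scale-dependent cross-correlation coefficient $r(k) = P_{ab}(k)/\sqrt{P_{aa}(k)\,P_{bb}(k)}$. The following is a minimal NumPy sketch of this diagnostic on toy 3-D grids (in grid units, with hypothetical function and bin names); it is not the specific estimator of Refs. [1, 2]:

```python
import numpy as np

def cross_corr_coeff(field_a, field_b, nbins=8):
    """Spherically averaged r(k) = P_ab / sqrt(P_aa * P_bb) for two 3-D fields."""
    n = field_a.shape[0]
    fa = np.fft.rfftn(field_a)
    fb = np.fft.rfftn(field_b)
    # |k| on the rfft grid, in grid units
    kx = np.fft.fftfreq(n) * n
    kz = np.fft.rfftfreq(n) * n
    k = np.sqrt(kx[:, None, None]**2 + kx[None, :, None]**2 + kz[None, None, :]**2)
    edges = np.linspace(0.0, k.max(), nbins + 1)
    idx = np.clip(np.digitize(k.ravel(), edges) - 1, 0, nbins - 1)
    # bin the raw cross- and auto-power mode by mode
    p_ab = np.bincount(idx, weights=(fa * np.conj(fb)).real.ravel(), minlength=nbins)
    p_aa = np.bincount(idx, weights=np.abs(fa).ravel()**2, minlength=nbins)
    p_bb = np.bincount(idx, weights=np.abs(fb).ravel()**2, minlength=nbins)
    return p_ab / np.sqrt(p_aa * p_bb + 1e-30)
```

A perfectly reconstructed field gives $r(k) = 1$ in every populated bin, so deviations from unity directly measure the information lost at each scale.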
First author: Susana J. Landau
A discrete space-time structure at about the Planck scale may become manifest in the form of very small violations of the conservation of the matter energy-momentum tensor. To include such violations, which are forbidden within the framework of General Relativity, the theory of unimodular gravity appears to be the simplest option for describing the gravitational interaction. In the cosmological context, a direct consequence of such energy non-conservation may be viewed heuristically as a “diffusion process of matter (both dark and ordinary)” into an effective dark-energy term in Einstein’s equations, which, under natural assumptions, leads to an adequate estimate of the value of the cosmological constant.
First author: Joseph Ryan
The planning and design of future experiments rely heavily on forecasting to assess the potential scientific value provided by a hypothetical set of measurements. The Fisher information matrix, due to its convenient properties and low computational cost, provides an especially useful forecasting tool. However, the Fisher matrix only provides a reasonable approximation to the true likelihood when data are nearly Gaussian distributed and observables have nearly linear dependence on the parameters of interest.
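For Gaussian-distributed data with parameter-independent noise, the Fisher matrix reduces to $F_{ij} = \sum_k \sigma_k^{-2}\, (\partial\mu_k/\partial\theta_i)(\partial\mu_k/\partial\theta_j)$, and forecast marginalized errors follow from its inverse. A minimal sketch with finite-difference derivatives and a hypothetical toy model (a straight line with unit noise; all names here are illustrative, not from the paper):

```python
import numpy as np

def fisher_matrix(model, theta0, sigma, x, step=1e-6):
    """F_ij = sum_k dmu_k/dtheta_i * dmu_k/dtheta_j / sigma_k^2 for Gaussian data."""
    theta0 = np.asarray(theta0, dtype=float)
    npar = theta0.size
    # central finite differences of the model predictions w.r.t. each parameter
    derivs = []
    for i in range(npar):
        dp = np.zeros(npar)
        dp[i] = step
        derivs.append((model(theta0 + dp, x) - model(theta0 - dp, x)) / (2 * step))
    F = np.empty((npar, npar))
    for i in range(npar):
        for j in range(npar):
            F[i, j] = np.sum(derivs[i] * derivs[j] / sigma**2)
    return F

# toy model: mu(x) = a + b*x observed at 10 points with unit noise
line = lambda th, x: th[0] + th[1] * x
x = np.linspace(0.0, 1.0, 10)
F = fisher_matrix(line, [1.0, 2.0], np.ones_like(x), x)
errors = np.sqrt(np.diag(np.linalg.inv(F)))  # forecast 1-sigma marginalized errors
```

For this linear model the Fisher forecast is exact; the abstract's caveat is precisely that for nonlinear models or non-Gaussian data this quadratic approximation to the log-likelihood can fail.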
First author: Ngai Pok Kwan
In light of GPU acceleration, sequential operations such as solving ordinary differential equations can become bottlenecks for gradient evaluations and hinder potential speed gains. In this work, we focus on the growth functions and their time derivatives in cosmological particle-mesh simulations and show that they account for the majority of the time cost when using gradient-based inference algorithms. We propose novel conditional B-spline emulators that directly learn an interpolating function for the growth factor as a function of time, conditioned on the cosmology.
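The idea of replacing a sequential ODE solve with a spline evaluation can be sketched as follows, assuming an Einstein-de Sitter cosmology ($\Omega_{\rm m}=1$, where $D(a)=a$ exactly) so the result is checkable; the paper's emulators additionally condition on cosmology and use learned B-spline coefficients, which this toy tabulate-then-interpolate version does not attempt:

```python
import numpy as np
from scipy.interpolate import make_interp_spline

def growth_table(a_start=0.01, a_end=1.0, nsteps=2000):
    """Integrate the EdS growth ODE D'' + 3/(2a) D' - 3/(2a^2) D = 0 with RK4."""
    def rhs(a, y):
        D, Dp = y
        return np.array([Dp, -1.5 * Dp / a + 1.5 * D / a**2])
    a = a_start
    y = np.array([a_start, 1.0])  # start on the growing mode: D = a, dD/da = 1
    h = (a_end - a_start) / nsteps
    avals, Dvals = [a], [y[0]]
    for _ in range(nsteps):
        k1 = rhs(a, y)
        k2 = rhs(a + h / 2, y + h / 2 * k1)
        k3 = rhs(a + h / 2, y + h / 2 * k2)
        k4 = rhs(a + h, y + h * k3)
        y = y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        a += h
        avals.append(a)
        Dvals.append(y[0])
    return np.array(avals), np.array(Dvals)

# solve once, then replace every later lookup by a cheap B-spline evaluation
a_grid, D_grid = growth_table()
D_spline = make_interp_spline(a_grid, D_grid / D_grid[-1], k=3)  # D(a=1) = 1
```

Once built, `D_spline` (and its derivative via `D_spline.derivative()`) can be evaluated at arbitrary times without any sequential integration, which is the property that makes the approach GPU- and gradient-friendly.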
First author: Carlos Mauricio Correa
Cosmic voids are promising cosmological laboratories for studying the dark energy phenomenon and alternative gravity theories. They are receiving special attention nowadays in view of the new generation of galaxy spectroscopic surveys, which are covering an unprecedented volume and redshift range. There are two primary statistics in void studies: (i) the void size function, which characterises the abundance of voids, and (ii) the void-galaxy cross-correlation function, which contains information about the density and velocity fields in these regions.
First author: Honggeun Kim
Detecting cosmological signals from the Epoch of Reionization (EoR) requires high-precision calibration to isolate the cosmological signals from foreground emission. In radio interferometry, perturbed primary beams of antenna elements can disrupt precise calibration, contaminating the foreground-free region, or the EoR window, in the cylindrically averaged power spectrum. For the Hydrogen Epoch of Reionization Array (HERA), we simulate and characterize the perturbed primary beams induced by feed motions (axial, lateral, and tilting) above the 14-meter dish.
First author: Patrick C. Breysse
We introduce a novel unbiased, cross-correlation estimator for the one-point statistics of cosmological random fields. One-point statistics are a useful tool for analysis of highly non-Gaussian density fields, while cross-correlations provide a powerful method for combining information from pairs of fields and separating them from noise and systematics. We derive a new Deconvolved Distribution Estimator that combines the useful properties of these two methods into one statistic.