DiRAC 2.5 Operations 2017-2020

Lead Research Organisation: University of Cambridge
Department Name: Institute of Astronomy

Abstract

Physicists across the astronomy, nuclear and particle physics communities are focussed on understanding how the Universe works at a very fundamental level. The distance scales with which they work vary by 50 orders of magnitude, from the smallest distances probed by experiments at the Large Hadron Collider, deep within the atomic nucleus, to the largest-scale galaxy clusters discovered out in space. The science challenges, however, are linked through questions such as: "How did the Universe begin and how is it evolving?" and "What are the fundamental constituents and fabric of the Universe, and how do they interact?"

Progress requires new astronomical observations and experimental data but also new theoretical insights. Theoretical understanding comes increasingly from large-scale computations that allow us to confront the consequences of our theories very accurately with the data or allow us to interrogate the data in detail to extract information that has impact on our theories. These computations test the fastest computers that we have and push the boundaries of technology in this sector. They also provide an excellent environment for training students in state-of-the-art techniques for code optimisation and data mining and visualisation.

The DiRAC-2.5 project builds on the success of the DiRAC HPC facility and will provide the resources needed to support cutting-edge research during 2017 in all areas of science supported by STFC.

In addition to the existing DiRAC-2 services, from April 2017 DiRAC-2.5 will provide:

1) A factor 2 increase in the computational power of the DiRAC supercomputer at the University of Durham, which is designed for simulations requiring large amounts of computer memory. The enhanced system will be used to:

(i) simulate the merger of pairs of black holes, which generates gravitational waves such as those recently discovered by the LIGO consortium;
(ii) perform the most realistic simulations to date of the formation and evolution of galaxies in the Universe; and
(iii) carry out detailed simulations of the interior of the Sun and of planetary interiors.

2) A new High Performance Computer at Cambridge whose architecture is well suited to the theoretical problems we want to tackle that involve large amounts of data, either as input or generated at intermediate stages of our calculations. Two key challenges we will tackle are:
(i) improving our understanding of the Milky Way through analysis of new data from the European Space Agency's GAIA satellite; and
(ii) improving the potential of experiments at CERN's Large Hadron Collider for the discovery of new physics by increasing the accuracy of theoretical predictions for rare processes involving the fundamental constituents of matter known as quarks.

3) An additional 3500 compute cores on the DiRAC Complexity supercomputer at Leicester, which will make it possible to carry out simulations of some of the most complex physical situations in the Universe. These include:
(i) the formation of stars in clusters - for the first time it will be possible to follow the formation of stars many times more massive than the Sun;
(ii) the accretion of gas onto supermassive black holes, the most efficient means of extracting energy from matter and the engine which drives galaxy formation and evolution.

4) A team of three research software engineers who will help DiRAC researchers ensure that their scientific codes extract the best possible performance from the hardware components of the DiRAC clusters. These highly skilled programmers will increase the effective computational power of the DiRAC facility during 2017.

Planned Impact

The expected impact of the DiRAC 2.5 HPC facility is fully described in the attached pathways to impact document and includes:

1) Disseminating best practice in High Performance Computing software engineering throughout the theoretical Particle Physics, Astronomy and Nuclear physics communities in the UK as well as to industry partners.

2) Working on co-design projects with industry partners to improve future generations of hardware and software.

3) Developing new techniques in High Performance Data Analytics which will benefit industry partners, as well as researchers in other fields such as biomedicine, biology, engineering, economics and social science, and the natural environment, who can use this new technology to improve research outcomes in their areas.

4) Sharing best practice on the design and operation of distributed HPC facilities with UK National e-Infrastructure partners.

5) Training the next generation of physical scientists to tackle problems effectively on state-of-the-art High Performance Computing facilities. Such skills are much in demand from high-tech industry.

6) Engaging with the general public to promote interest in science, and to explain how our ability to solve complex problems using the latest computer technology leads to new scientific capabilities and insights. Engagement of this kind also naturally encourages the uptake of STEM subjects in schools.

Publications

 
Description Many new discoveries about the formation and evolution of galaxies, star formation, planet formation and particle physics theory have been made possible by the award.
Exploitation Route Many international collaborative projects are supported by the HPC resources provided by DiRAC
Sectors Aerospace, Defence and Marine, Creative Economy, Digital/Communication/Information Technologies (including Software), Education, Healthcare

URL http://www.dirac.ac.uk
 
Title 25 hot-Jupiter properties from HST & Spitzer 
Description VizieR online data catalogue associated with the article 'Five key exoplanet questions answered via the analysis of 25 hot-Jupiter atmospheres in eclipse', published in the Astrophysical Journal Supplement Series (bibcode: 2022ApJS..260....3C). An illustrative retrieval example follows this record.
Type Of Material Database/Collection of data 
Year Produced 2022 
Provided To Others? Yes  
URL https://cdsarc.cds.unistra.fr/viz-bin/cat/J/ApJS/260/3
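The catalogue above can be retrieved programmatically; the following is a minimal sketch, assuming the astroquery package (not part of this record) and using the catalogue identifier visible in the URL.

```python
# Illustrative sketch: fetch the hot-Jupiter eclipse catalogue J/ApJS/260/3 from VizieR.
# Assumes astroquery is installed; the catalogue ID comes from the URL in the record above.
from astroquery.vizier import Vizier

vizier = Vizier(row_limit=-1)                  # -1 lifts the default 50-row cap
catalogs = vizier.get_catalogs("J/ApJS/260/3")

print(catalogs)                # summary of the tables returned for this catalogue
table = catalogs[0]            # first table as an astropy Table
print(table.colnames)          # available columns
print(table[:5])               # first few rows
```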
 
Title Bayesian evidence for the tensor-to-scalar ratio r and neutrino masses m_nu: Effects of uniform vs logarithmic priors (supplementary inference products) 
Description These are the nested sampling inference products and input files that were used to compute results for arXiv:2102.11511. Example plotting scripts (as .ipynb or as .html files) and figures from the papers are included to demonstrate usage. An illustrative loading sketch with anesthetic follows this record.

Filename conventions:
lcdm: concordance cosmological model \(\Lambda\mathrm{CDM}\) (without extension this assumes \(r=0\) and a single massive neutrino with mass \(m_\nu=0.06\,\mathrm{eV}\)).
_r: \(\Lambda\mathrm{CDM}\) with variable tensor-to-scalar ratio \(r\).
_nu: \(\Lambda\mathrm{CDM}\) with three massive neutrinos, sampling over the lightest neutrino mass \(m_\mathrm{light}\) and the squared mass splittings \(\delta m^2\) and \(\Delta m^2\).
mcmc: Cobaya's Markov Chain Monte Carlo Metropolis sampler (https://github.com/CobayaSampler/cobaya/releases/tag/v3.0.2).
pc#d###: PolyChord run with #d repeats per parameter block (where d is the number of parameters in that block) and with ### live points (https://github.com/PolyChord/PolyChordLite/releases/tag/1.17.1).
_class: theory code CLASS (https://github.com/lesgourg/class_public/releases/tag/v2.9.4).
_p18_TTTEEElowTE_SZ: Planck 2018 TT,TE,EE+lowl+lowE data (https://pla.esac.esa.int/pla/#cosmology).
_nufit50: NuFIT 5.0 data (http://www.nu-fit.org/?q=node/228).
_NH and _IH: normal and inverted neutrino hierarchy.
_logr##: logarithmic sampling of the tensor-to-scalar ratio \(r\) with lower log bound given by log10r=-##.
_mdD: sampling over the lightest neutrino mass \(m_\mathrm{light}\) and the squared mass splittings \(\delta m^2\) and \(\Delta m^2\) (the medium and heavy neutrino masses are derived parameters), with mass units in eV.
_logmdD##: logarithmic (instead of uniform) sampling of the lightest neutrino mass \(m_\mathrm{light}\) with lower log bound given by log10mlight=-##.

Datasets used for the nested sampling runs:
Planck 2018 TT,TE,EE+lowl+lowE: https://pla.esac.esa.int/pla/#cosmology
NuFIT 5.0: http://www.nu-fit.org/?q=node/228

Software used:
Cobaya: https://github.com/CobayaSampler/cobaya/releases/tag/v3.0.2
CLASS: https://github.com/lesgourg/class_public/releases/tag/v2.9.4
PolyChord: https://github.com/PolyChord/PolyChordLite/releases/tag/1.17.1
Anesthetic: https://github.com/lukashergt/anesthetic/tree/138299739544e888cc318746be087c898f1aff15

For more details see Cobaya's (https://cobaya.readthedocs.io/en/latest/index.html) and Anesthetic's (https://anesthetic.readthedocs.io/en/latest/) documentation.
Type Of Material Database/Collection of data 
Year Produced 2021 
Provided To Others? Yes  
URL https://zenodo.org/doi/10.5281/zenodo.4556359
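A minimal sketch, not taken from the bundled example scripts, of how two of these PolyChord runs might be loaded and compared, assuming the anesthetic 2.x API (read_chains, logZ); the chain roots are illustrative compositions of the filename conventions above and may not match the archive exactly.

```python
# Sketch only: load two nested-sampling runs with anesthetic and compare evidences.
# The roots below are hypothetical examples built from the filename conventions in the record.
from anesthetic import read_chains

uniform = read_chains("pc5d500_class_p18_TTTEEElowTE_SZ_lcdm_r")        # uniform prior on r
logprior = read_chains("pc5d500_class_p18_TTTEEElowTE_SZ_lcdm_logr10")  # log prior, log10(r) > -10

# Each nested-sampling run carries a log-evidence estimate; their difference is the
# log Bayes factor between the two prior choices (logZ() is assumed from anesthetic 2.x).
print("ln B (uniform vs log prior on r):", uniform.logZ() - logprior.logZ())
```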
 
Title Data underpinning: Can Multi-Threaded Flux Tubes in Coronal Arcades Support a Magnetohydrodynamic Avalanche? 
Description Attached is the relevant source code used for the numerical simulations undertaken in connection with the work underpinned. These simulations use a numerical MHD code produced by Tony Arber (now at the University of Warwick) and others, adapted and configured to reflect the particular model at hand.
Type Of Material Database/Collection of data 
Year Produced 2022 
Provided To Others? Yes  
URL https://risweb.st-andrews.ac.uk/portal/en/datasets/data-underpinning-can-multithreaded-flux-tubes-in...
 
Title Data underpinning: Self-consistent nanoflare heating in model active regions: MHD avalanches 
Description  
Type Of Material Database/Collection of data 
Year Produced 2022 
Provided To Others? Yes  
URL https://risweb.st-andrews.ac.uk/portal/en/datasets/data-underpinning-selfconsistent-nanoflare-heatin...
 
Title Inference products for "Finite inflation in curved space" 
Description These are the MCMC and nested sampling inference products and input files that were used to compute results for the paper "Finite inflation in curved space" by L. T. Hergt, F. J. Agocs, W. J. Handley, M. P. Hobson, and A. N. Lasenby from 2022. Example plotting scripts (as .ipynb or as .html files) and figures from the paper are included to demonstrate usage. An illustrative GetDist sketch follows this record.

Python packages used for the generation of the MCMC and nested sampling chains:
anesthetic 2.0.0b12
classy 2.9.4
cobaya 3.0.4
GetDist 1.3.3
primpy 2.3.6
pyoscode 1.0.4
pypolychord 1.20.0

Filename conventions:
mcmc: MCMC run.
pcs#d####: PolyChord run (in synchronous mode) with #d repeats per parameter block (where d is the number of parameters in that block) and with #### live points.
_cl_hf: using the Boltzmann theory code CLASS with the nonlinearities code halofit.
_p18: using Planck 2018 CMB data.
_TTTEEE: using the high-l TTTEEE likelihood.
_TTTEEElite: using the lite version of the high-l TTTEEE likelihood.
_lowl_lowE: using the low-l likelihoods for temperature and E-modes.
_BK15: using data from the 2015 observing season of BICEP2 and the Keck Array.
lcdm: concordance cosmological model \(\Lambda\mathrm{CDM}\) (standard 6 cosmological sampling parameters, no tensor perturbations, zero spatial curvature).
_r: extension with a variable tensor-to-scalar ratio \(r\).
_omegak: extension with a variable curvature density parameter \(\Omega_K\).
_H0: sampling over \(H_0\) instead of \(\theta_\mathrm{s}\).
_omegakh2: extension with a variable curvature density parameter, but sampling over \(H_0\) instead of \(\theta_\mathrm{s}\) and over \(\omega_K\equiv\Omega_K h^2\) instead of \(\Omega_K\).
_mn2: using a quadratic monomial potential for the computation of the primordial universe.
_nat: using the natural inflation potential for the computation of the primordial universe.
_stb: using the Starobinsky potential for the computation of the primordial universe.
_AsfoH: using the primordial sampling parameters {logA_SR, N_star, f_i, omega_K, H0}.
_perm: assuming a permissive reheating scenario.
Type Of Material Database/Collection of data 
Year Produced 2022 
Provided To Others? Yes  
URL https://zenodo.org/doi/10.5281/zenodo.6547871
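A minimal sketch, separate from the bundled notebooks, of inspecting one of these runs with GetDist (version 1.3.3 in the package list above); the chain root and the parameter names 'omegak' and 'H0' are assumptions based on the filename conventions, not guaranteed to match the archive.

```python
# Sketch only: load a Cobaya MCMC chain with GetDist and plot a 2D marginal posterior.
# The root and parameter names are hypothetical, following the conventions in the record.
from getdist import loadMCSamples, plots

samples = loadMCSamples("mcmc_cl_hf_p18_TTTEEE_lowl_lowE_lcdm_omegak_H0",
                        settings={"ignore_rows": 0.3})  # discard burn-in from the MCMC chain

g = plots.get_subplot_plotter()
g.triangle_plot(samples, ["omegak", "H0"], filled=True)  # joint and marginal posteriors
g.export("omegak_H0_triangle.pdf")
```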
 
Title Renormalisation of the energy-momentum tensor in three-dimensional scalar SU(N) theories using the Wilson flow -- data release 
Description This repository contains the lattice two-point function measurements required to reproduce the results of the paper "Renormalisation of the energy-momentum tensor in three-dimensional scalar SU(N) theories using the Wilson flow" (https://arxiv.org/abs/2009.14767). The code required to perform the data analysis can be found in https://github.com/josephleekl/scalar_emt_analysis. For any questions please get in touch: joseph.lee@ed.ac.uk 
Type Of Material Database/Collection of data 
Year Produced 2020 
Provided To Others? Yes  
URL https://zenodo.org/record/4290392
 
Title Research data supporting "Finite-Temperature Effects on the X-ray Absorption Spectra of Crystalline Aluminas from First Principles" 
Description The data in this submission consists of all files generated using the plane-wave density-functional theory code, CASTEP. It contains the structure files for the alpha and gamma alumina crystal structures as well as the structure files from Monte Carlo sampling. In addition, it contains the electronic density of states calculations and core-hole X-ray absorption data for alpha and gamma alumina at 0 K and 300 K. An illustrative structure-loading sketch follows this record.
Type Of Material Database/Collection of data 
Year Produced 2023 
Provided To Others? Yes  
URL https://www.repository.cam.ac.uk/handle/1810/349670
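As a usage illustration only, one of the crystal-structure files could be inspected with ASE's CASTEP cell reader; ASE is not part of this dataset and the filename below is hypothetical.

```python
# Sketch: read a CASTEP .cell structure file with ASE and print basic lattice information.
# 'alpha_Al2O3.cell' is a hypothetical filename standing in for a file from the dataset.
from ase.io import read

atoms = read("alpha_Al2O3.cell", format="castep-cell")
print(atoms.get_chemical_formula())   # composition of the simulation cell
print(atoms.cell.lengths())           # lattice vector lengths in Angstrom
print(atoms.cell.angles())            # cell angles in degrees
```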
 
Title Spectra and light curves presented in Double detonations: variations in Type Ia supernovae due to different core and He shell masses - II. Synthetic observables 
Description Dataset containing the simulated spectra and light curves presented in the paper "Double detonations: variations in Type Ia supernovae due to different core and He shell masses - II. Synthetic observables".
Type Of Material Database/Collection of data 
Year Produced 2022 
Provided To Others? Yes  
URL https://zenodo.org/record/7997387