DiRAC 2.5 Operations 2017-2020
Lead Research Organisation:
University of Cambridge
Department Name: Institute of Astronomy
Abstract
Physicists across the astronomy, nuclear and particle physics communities are focussed on understanding how the Universe works at a very fundamental level. The distance scales with which they work span 50 orders of magnitude, from the smallest distances probed by experiments at the Large Hadron Collider, deep within the atomic nucleus, to the largest galaxy clusters observed in space. The science challenges, however, are linked through questions such as: How did the Universe begin and how is it evolving? What are the fundamental constituents and fabric of the Universe, and how do they interact?
Progress requires new astronomical observations and experimental data but also new theoretical insights. Theoretical understanding comes increasingly from large-scale computations that allow us to confront the consequences of our theories very accurately with the data or allow us to interrogate the data in detail to extract information that has impact on our theories. These computations test the fastest computers that we have and push the boundaries of technology in this sector. They also provide an excellent environment for training students in state-of-the-art techniques for code optimisation and data mining and visualisation.
The DiRAC-2.5 project builds on the success of the DiRAC HPC facility and will provide the resources needed to support cutting-edge research during 2017 in all areas of science supported by STFC.
In addition to the existing DiRAC-2 services, from April 2017 DiRAC-2.5 will provide:
1) A factor-of-2 increase in the computational power of the DiRAC supercomputer at the University of Durham, which is designed for simulations requiring large amounts of computer memory. The enhanced system will be used to:
(i) simulate the mergers of pairs of black holes, which generate gravitational waves such as those recently detected by the LIGO consortium;
(ii) perform the most realistic simulations to date of the formation and evolution of galaxies in the Universe;
(iii) carry out detailed simulations of the interiors of the Sun and of planets.
2) A new High Performance Computer at Cambridge whose architecture is well suited to theoretical problems that involve large amounts of data, either as input or generated at intermediate stages of our calculations. Two key challenges that we will tackle are:
(i) improving our understanding of the Milky Way through analysis of new data from the European Space Agency's Gaia satellite; and
(ii) improving the potential of experiments at CERN's Large Hadron Collider to discover new physics by increasing the accuracy of theoretical predictions for rare processes involving the fundamental constituents of matter known as quarks.
3) An additional 3500 compute cores on the DiRAC Complexity supercomputer at Leicester, which will make it possible to carry out simulations of some of the most complex physical situations in the Universe. These include:
(i) the formation of stars in clusters - for the first time it will be possible to follow the formation of stars many times more massive than the Sun;
(ii) the accretion of gas onto supermassive black holes, the most efficient means of extracting energy from matter and the engine that drives galaxy formation and evolution.
4) A team of three research software engineers who will help DiRAC researchers optimise their scientific codes to extract the best possible performance from the hardware of the DiRAC clusters. These highly skilled programmers will increase the effective computational power of the DiRAC facility during 2017.
Planned Impact
The expected impact of the DiRAC 2.5 HPC facility is fully described in the attached pathways to impact document and includes:
1) Disseminating best practice in High Performance Computing software engineering throughout the theoretical Particle Physics, Astronomy and Nuclear physics communities in the UK as well as to industry partners.
2) Working on co-design projects with industry partners to improve future generations of hardware and software.
3) Developing new techniques in High Performance Data Analytics that will benefit industry partners as well as researchers in other fields - such as biomedicine, biology, engineering, economics, social science and the natural environment - who can use this technology to improve research outcomes in their areas.
4) Sharing best practice on the design and operation of distributed HPC facilities with UK National e-Infrastructure partners.
5) Training the next generation of physical scientists to tackle problems effectively on state-of-the-art High Performance Computing facilities. Such skills are in high demand in high-tech industry.
6) Engaging with the general public to promote interest in science, and to explain how our ability to solve complex problems using the latest computer technology leads to new scientific capabilities and insights. Engagement of this kind also naturally encourages the uptake of STEM subjects in schools.
Publications
Hardy F (2023) Estimating nosocomial infection and its outcomes in hospital patients in England with a diagnosis of COVID-19 using machine learning, in International Journal of Data Science and Analytics
Heyl J (2023) Data quality and autism: Issues and potential impacts, in International Journal of Medical Informatics
Stevenson P (2020) Internuclear potentials from the Sky3D code, in IOP SciNotes
Pamela S (2022) A generalised formulation of G-continuous Bezier elements applied to non-linear MHD simulations, in Journal of Computational Physics
Barrera-Hinojosa C (2020) GRAMSES: a new route to general relativistic N-body simulations in cosmology. Part II. Initial conditions, in Journal of Cosmology and Astroparticle Physics
Hernández-Aguayo C (2022) Fast full N-body simulations of generic modified gravity: derivative coupling models, in Journal of Cosmology and Astroparticle Physics
Aurrekoetxea J (2020) The effects of potential shape on inhomogeneous inflation, in Journal of Cosmology and Astroparticle Physics
Balázs C (2022) Cosmological constraints on decaying axion-like particles: a global analysis, in Journal of Cosmology and Astroparticle Physics
Leo M (2020) Constraining structure formation using EDGES, in Journal of Cosmology and Astroparticle Physics
Bozorgnia N (2020) The dark matter component of the Gaia radially anisotropic substructure, in Journal of Cosmology and Astroparticle Physics
Muia F (2019) The fate of dense scalar stars, in Journal of Cosmology and Astroparticle Physics
Macpherson H (2023) Cosmological distances with general-relativistic ray tracing: framework and comparison to cosmographic predictions, in Journal of Cosmology and Astroparticle Physics
Pedersen C (2020) Massive neutrinos and degeneracies in Lyman-alpha forest simulations, in Journal of Cosmology and Astroparticle Physics
Barrera-Hinojosa C (2020) GRAMSES: a new route to general relativistic N-body simulations in cosmology. Part I. Methodology and code description, in Journal of Cosmology and Astroparticle Physics
Heinesen A (2022) A prediction for anisotropies in the nearby Hubble flow, in Journal of Cosmology and Astroparticle Physics
Givans J (2022) Non-linearities in the Lyman-α forest and in its cross-correlation with dark matter halos, in Journal of Cosmology and Astroparticle Physics
Almaraz E (2020) Nonlinear structure formation in Bound Dark Energy, in Journal of Cosmology and Astroparticle Physics
Bozorgnia N (2019) On the correlation between the local dark matter and stellar velocities, in Journal of Cosmology and Astroparticle Physics
Aviles A (2020) Marked correlation functions in perturbation theory, in Journal of Cosmology and Astroparticle Physics
Shiraishi M (2019) General modal estimation for cross-bispectra, in Journal of Cosmology and Astroparticle Physics
Becker C (2020) Proca-stinated cosmology. Part I. A N-body code for the vector Galileon, in Journal of Cosmology and Astroparticle Physics
De Jong E (2023) Spinning primordial black holes formed during a matter-dominated era, in Journal of Cosmology and Astroparticle Physics
Clarke P (2021) Probing inflation with precision bispectra, in Journal of Cosmology and Astroparticle Physics
Poole-McKenzie R (2020) Informing dark matter direct detection limits with the ARTEMIS simulations, in Journal of Cosmology and Astroparticle Physics
Nazari Z (2021) Oscillon collapse to black holes, in Journal of Cosmology and Astroparticle Physics
Pedersen C (2021) An emulator for the Lyman-α forest in beyond-ΛCDM cosmologies, in Journal of Cosmology and Astroparticle Physics
Widdicombe J (2020) Black hole formation in relativistic Oscillaton collisions, in Journal of Cosmology and Astroparticle Physics
De Jong E (2022) Primordial black hole formation with full numerical relativity, in Journal of Cosmology and Astroparticle Physics
Hughes D (2019) Force balance in convectively driven dynamos with no inertia, in Journal of Fluid Mechanics
Robins L (2020) Viscous and inviscid strato-rotational instability, in Journal of Fluid Mechanics
Hori K (2020) Solitary magnetostrophic Rossby waves in spherical shells, in Journal of Fluid Mechanics
Mak M (2023) 3D Simulations of the Archean Earth Including Photochemical Haze Profiles, in Journal of Geophysical Research: Atmospheres
Wakita S (2022) Effect of Impact Velocity and Angle on Deformational Heating and Postimpact Temperature, in Journal of Geophysical Research: Planets
Allanson O (2020) Particle-in-Cell Experiments Examine Electron Diffusion by Whistler-Mode Waves: 2. Quasi-Linear and Nonlinear Dynamics, in Journal of Geophysical Research: Space Physics
Allanson O (2021) Electron Diffusion and Advection During Nonlinear Interactions With Whistler-Mode Waves, in Journal of Geophysical Research: Space Physics
Allanson O (2019) Particle-in-cell Experiments Examine Electron Diffusion by Whistler-mode Waves: 1. Benchmarking With a Cold Plasma, in Journal of Geophysical Research: Space Physics
Cheung G (2021) $DK$ $I = 0$, $D\overline{K}$ $I = 0, 1$ scattering and the $D_{s0}^{\ast}(2317)$ from lattice QCD, in Journal of High Energy Physics
Ježo T (2023) Resonance-aware NLOPS matching for off-shell $t\overline{t}$ + tW production with semileptonic decays, in Journal of High Energy Physics
Marolf D (2019) Phases of holographic Hawking radiation on spatially compact spacetimes, in Journal of High Energy Physics
Boyle P (2023) Isospin-breaking corrections to light-meson leptonic decays from lattice simulations at physical quark masses, in Journal of High Energy Physics
Czakon M (2023) Infrared-safe flavoured anti-kT jets, in Journal of High Energy Physics
Lindert J (2023) Precise predictions for V + 2 jet backgrounds in searches for invisible Higgs decays, in Journal of High Energy Physics
Cheung G (2017) Tetraquark operators in lattice QCD and exotic flavour states in the charm sector, in Journal of High Energy Physics
Ryan S (2021) Excited and exotic bottomonium spectroscopy from lattice QCD, in Journal of High Energy Physics
Bena I (2019) Holographic dual of hot Polchinski-Strassler quark-gluon plasma, in Journal of High Energy Physics
Andrade T (2022) Evidence for violations of Weak Cosmic Censorship in black hole collisions in higher dimensions, in Journal of High Energy Physics
Czakon M (2023) A detailed investigation of W+c-jet at the LHC, in Journal of High Energy Physics
Czakon M (2023) NNLO B-fragmentation fits and their application to $t\overline{t}$ production and decay at the LHC, in Journal of High Energy Physics
Gavardi A (2023) NNLO+PS W+W- production using jet veto resummation at NNLL', in Journal of High Energy Physics
Barone A (2023) Approaches to inclusive semileptonic B(s)-meson decays from Lattice QCD, in Journal of High Energy Physics