DiRAC 2.5y - Networks and Data Management
Lead Research Organisation:
UNIVERSITY COLLEGE LONDON
Department Name: Physics and Astronomy
Abstract
Physicists across the astronomy, nuclear and particle physics communities are focussed
on understanding how the Universe works at a very fundamental level. The distance scales
with which they work vary by 50 orders of magnitude from the smallest distances probed
by experiments at the Large Hadron Collider, deep within the atomic
nucleus, to the largest-scale galaxy clusters discovered out in space. The science challenges,
however, are linked through questions such as: How did the Universe begin and how is it evolving?
and What are the fundamental constituents and fabric of the Universe and how do they interact?
Progress requires new astronomical observations and experimental data but also
new theoretical insights. Theoretical understanding comes increasingly from large-scale
computations that allow us to confront the consequences of our theories very accurately
with the data or allow us to interrogate the data in detail to extract information that has
impact on our theories. These computations test the fastest computers that we have and
push the boundaries of technology in this sector. They also provide an excellent
environment for training students in state-of-the-art techniques for code optimisation and
data mining and visualisation.
The DiRAC2 HPC facility has been operating since 2012, providing computing resources for theoretical research
in all areas of particle physics, astronomy, cosmology and nuclear physics supported by STFC. It is a highly productive facility, generating 200-250 papers annually in international, peer-reviewed journals. However, the DiRAC facility risks becoming uncompetitive as it has remained static in terms of overall capability since 2012. The DiRAC-2.5x investment in 2017/18 mitigated the risk of hardware failures by replacing our oldest hardware components. However, as the factor-5 oversubscription of the most recent RAC call demonstrated, the science programme in 2019/20 and beyond requires a significant uplift in DiRAC's compute capability. The main purpose of the requested funding for the DiRAC2.5y project is to provide a factor-2 increase in computing across all DiRAC services to enable the facility to remain competitive during 2019/20 in anticipation of future funding for DiRAC-3.
DiRAC2.5y builds on the success of the DiRAC HPC facility and will provide the resources needed to support cutting-edge research during 2019 in all areas of science supported by STFC. While the funding is required for the facility to remain competitive, the science programme will continue to be world-leading. Examples of the projects which will benefit from this investment include:
(i) lattice quantum chromodynamics (QCD) calculations of the properties of fundamental particles from first principles;
(ii) improving the potential of experiments at CERN's Large Hadron Collider for discovery of new physics by increasing the accuracy of theoretical predictions for rare processes involving the fundamental constituents of matter known as quarks;
(iii) simulations of the merger of pairs of black holes which generate gravitational waves such as those recently discovered by the LIGO consortium;
(iv) the most realistic simulations to date of the formation and evolution of galaxies in the Universe;
(v) the accretion of gas onto supermassive black holes, the most efficient means of extracting energy from matter and the engine which drives galaxy evolution;
(vi) new models of our own Milky Way galaxy calibrated using new data from the European Space Agency's GAIA satellite;
(vii) detailed simulations of the interior of the sun and of planetary interiors;
(viii) the formation of stars in clusters - for the first time it will be possible to follow the formation of massive stars.
Planned Impact
The anticipated impact of the DiRAC2.5y HPC facility aligns closely with the recently published UK Industrial Strategy. As such, many of our key impacts will be driven by our engagements with industry. Each service provider for DiRAC2.5y has a local industrial strategy to deliver increased levels of industrial returns over the next three years.
The "Pathways to impact" document which is attached to this proposal describes the overall industrial strategy for the DiRAC facility, including our strategic goals and key performance indicators.
Organisations
Publications
Lawlor D
(2025)
Dense QC$_2$D: What's up with that?!?
Bignell R
(2025)
Anisotropic excited bottomonia from a basis of smeared operators
Allton C
(2024)
Thermal lattice QCD results from the FASTSUM collaboration
Forzano N
(2023)
Lattice studies of Sp(2N) gauge theories using GRID
Harrison J
(2025)
Update of HPQCD $B_c\to J/\psi$ Form Factors
Lachini N
(2024)
Towards a high-precision description of the ρ and K* resonances
Horohan D'arcy R
(2025)
NRQCD Bottomonium at non-zero temperature using time-derivative moments
Itcovitz J
(2024)
The Distribution of Impactor Core Material During Large Impacts on Earth-like Planets
in The Planetary Science Journal
Dai K
(2024)
Impact Momentum Transfer-Insights from Numerical Simulation of Impacts on Large Boulders of Asteroids
in The Planetary Science Journal
Sorini D
(2024)
The impact of feedback on the evolution of gas density profiles from galaxies to clusters: a universal fitting formula from the Simba suite of simulations
in The Open Journal of Astrophysics
Oestreicher A
(2024)
Backreaction in Numerical Relativity: Averaging on Newtonian gauge-like hypersurfaces in Einstein Toolkit cosmological simulations
in The Open Journal of Astrophysics
Prole L
(2024)
Population III star formation: multiple gas phases prevent the use of an equation of state at high densities
in The Open Journal of Astrophysics
Clough K
(2024)
What no one has seen before: gravitational waveforms from warp drive collapse
in The Open Journal of Astrophysics
Lindert J
(2024)
Logarithmic EW corrections at one-loop
in The European Physical Journal C
Aarts G
(2024)
Non-zero temperature study of spin 1/2 charmed baryons using lattice gauge theory
in The European Physical Journal A
Ratnasingam R
(2024)
On the Geometry of the Near-core Magnetic Field in Massive Stars
in The Astrophysical Journal Letters
Banerjee A
(2024)
Atmospheric Retrievals Suggest the Presence of a Secondary Atmosphere and Possible Sulfur Species on L98-59 d from JWST Nirspec G395H Transmission Spectroscopy
in The Astrophysical Journal Letters
Scardoni C
(2024)
Seeing the Unseen: A Method to Detect Unresolved Rings in Protoplanetary Disks
in The Astrophysical Journal
Kesri K
(2024)
Dependence of Spicule Properties on the Magnetic Field-Results from Magnetohydrodynamics Simulations
in The Astrophysical Journal
Carlberg R
(2024)
Star Stream Velocity Distributions in Cold Dark Matter and Warm Dark Matter Galactic Halos
in The Astrophysical Journal
Varghese A
(2024)
Effect of Rotation on Wave Mixing in Intermediate-mass Stars
in The Astrophysical Journal
Sergeev D
(2024)
The Impact of the Explicit Representation of Convection on the Climate of a Tidally Locked Planet in Global Stretched-mesh Simulations
in The Astrophysical Journal
Farren G
(2024)
The Atacama Cosmology Telescope: Cosmology from Cross-correlations of unWISE Galaxies and ACT DR6 CMB Lensing
in The Astrophysical Journal
Regos E
(2024)
Percolation Statistics in the MillenniumTNG Simulations
in The Astrophysical Journal
Kong S
(2024)
Filamentary Molecular Cloud Formation via Collision-induced Magnetic Reconnection in a Cold Neutral Medium
in The Astrophysical Journal
Martin-Alvarez S
(2024)
Extragalactic Magnetism with SOFIA (SALSA Legacy Program). VII. A Tomographic View of Far-infrared and Radio Polarimetric Observations through MHD Simulations of Galaxies
in The Astrophysical Journal
Eilers A
(2024)
EIGER. VI. The Correlation Function, Host Halo Mass, and Duty Cycle of Luminous Quasars at z ~ 6
in The Astrophysical Journal
Garg P
(2024)
Theoretical Strong-line Metallicity Diagnostics for the JWST Era
in The Astrophysical Journal
Wyper P
(2024)
A Model for Flux Rope Formation and Disconnection in Pseudostreamer Coronal Mass Ejections
in The Astrophysical Journal
Mills B
(2024)
Spectral Calculations of 3D Radiation Magnetohydrodynamic Simulations of Super-Eddington Accretion onto a Stellar-mass Black Hole
in The Astrophysical Journal
Aslanyan V
(2024)
A New Field Line Tracer for the Study of Coronal Magnetic Topologies
in The Astrophysical Journal
Karpen J
(2024)
Solar Eruptions in Nested Magnetic Flux Systems
in The Astrophysical Journal
Qu F
(2024)
The Atacama Cosmology Telescope: A Measurement of the DR6 CMB Lensing Power Spectrum and Its Implications for Structure Growth
in The Astrophysical Journal
Forrest B
(2024)
MAGAZ3NE: Massive, Extremely Dusty Galaxies at z ~ 2 Lead to Photometric Overestimation of Number Densities of the Most Massive Galaxies at 3 < z < 4*
in The Astrophysical Journal
Tripathi B
(2024)
Predicting the Slowing of Stellar Differential Rotation by Instability-driven Turbulence
in The Astrophysical Journal
Murtas G
(2024)
Kink Instability of Flux Ropes in Partially Ionized Plasmas
in The Astrophysical Journal
Macpherson H
(2024)
The Impact of Anisotropic Sky Sampling on the Hubble Constant in Numerical Relativity
in The Astrophysical Journal
Householder A
(2024)
Investigating the Atmospheric Mass Loss of the Kepler-105 Planets Straddling the Radius Gap
in The Astronomical Journal
Sellek A
(2024)
Modeling JWST MIRI-MRS Observations of T Cha: Mid-IR Noble Gas Emission Tracing a Dense Disk Wind
in The Astronomical Journal
Berné O
(2024)
A far-ultraviolet-driven photoevaporation flow observed in a protoplanetary disk.
in Science (New York, N.Y.)
Alielden K
(2023)
ARTop: an open-source tool for measuring active region topology at the solar photosphere
in RAS Techniques and Instruments
Glasscock K
(2024)
Statistical estimation of full-sky radio maps from 21 cm array visibility data using Gaussian constrained realizations
in RAS Techniques and Instruments
Zhang M
(2025)
The Three Hundred project: The relationship between the shock and splashback radii of simulated galaxy clusters
in Publications of the Astronomical Society of Australia
| Title | Collaboration with Atempo |
| Description | Tape-to-tape data transfer between DiRAC sites. |
| Type Of Technology | Software |
| Year Produced | 2019 |
| Open Source License? | Yes |
| Impact | Proof of concept that data could be read from tape stores remotely via a remote file system |
| Title | Fast Network Links for Durham and Cambridge Universities |
| Description | The universities of Durham and Cambridge are now linked by a highly performant network |
| Type Of Technology | Physical Model/Kit |
| Year Produced | 2019 |
| Impact | Both HEIs are able to ingest data at a faster rate |
| Description | Member of UKRI E-Infrastructure Expert Panel 2017-2019 |
| Form Of Engagement Activity | A formal working group, expert panel or dialogue |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Policymakers/politicians |
| Results and Impact | Created 7 white papers for UKRI which detailed a Roadmap for future e-Infrastructure funding in the UK |
| Year(s) Of Engagement Activity | 2017,2018,2019 |
