DiRAC 2.5y - Networks and Data Management
Lead Research Organisation:
UNIVERSITY COLLEGE LONDON
Department Name: Physics and Astronomy
Abstract
Physicists across the astronomy, nuclear and particle physics communities are focussed
on understanding how the Universe works at a very fundamental level. The distance scales
with which they work vary by 50 orders of magnitude from the smallest distances probed
by experiments at the Large Hadron Collider, deep within the atomic
nucleus, to the largest-scale galaxy clusters discovered out in space. The science challenges,
however, are linked through common questions: How did the Universe begin and how is it evolving?
What are the fundamental constituents and fabric of the Universe, and how do they interact?
Progress requires new astronomical observations and experimental data but also
new theoretical insights. Theoretical understanding comes increasingly from large-scale
computations that allow us to confront the consequences of our theories very accurately
with the data or allow us to interrogate the data in detail to extract information that has
impact on our theories. These computations test the fastest computers that we have and
push the boundaries of technology in this sector. They also provide an excellent
environment for training students in state-of-the-art techniques for code optimisation and
data mining and visualisation.
The DiRAC2 HPC facility has been operating since 2012, providing computing resources for theoretical research
in all areas of particle physics, astronomy, cosmology and nuclear physics supported by STFC. It is a highly productive facility, generating 200-250 papers annually in international, peer-reviewed journals. However, the DiRAC facility risks becoming uncompetitive as it has remained static in overall capability since 2012. The DiRAC-2.5x investment in 2017/18 mitigated the risk of hardware failures by replacing our oldest hardware components. However, as the factor-5 oversubscription of the most recent RAC call demonstrated, the science programme in 2019/20 and beyond requires a significant uplift in DiRAC's compute capability. The main purpose of the requested funding for the DiRAC2.5y project is to provide a factor-2 increase in computing across all DiRAC services, enabling the facility to remain competitive during 2019/20 in anticipation of future funding for DiRAC-3.
DiRAC2.5y builds on the success of the DiRAC HPC facility and will provide the resources needed to support cutting-edge research during 2019 in all areas of science supported by STFC. While the funding is required to remain competitive, the science programme will continue to be world-leading. Examples of the projects which will benefit from this investment include:
(i) lattice quantum chromodynamics (QCD) calculations of the properties of fundamental particles from first principles;
(ii) improving the potential of experiments at CERN's Large Hadron Collider for discovery of new physics by increasing the accuracy of theoretical predictions for rare processes involving the fundamental constituents of matter known as quarks;
(iii) simulations of the merger of pairs of black holes, which generate gravitational waves such as those recently discovered by the LIGO consortium;
(iv) the most realistic simulations to date of the formation and evolution of galaxies in the Universe;
(v) the accretion of gas onto supermassive black holes, the most efficient means of extracting energy from matter and the engine which drives galaxy evolution;
(vi) new models of our own Milky Way galaxy calibrated using new data from the European Space Agency's GAIA satellite;
(vii) detailed simulations of the interior of the sun and of planetary interiors;
(viii) the formation of stars in clusters - for the first time it will be possible to follow the formation of massive stars.
Planned Impact
The anticipated impact of the DiRAC2.5y HPC facility aligns closely with the recently published UK Industrial Strategy. As such, many of our key impacts will be driven by our engagements with industry. Each service provider for DiRAC2.5y has a local industrial strategy to deliver increased levels of industrial returns over the next three years.
The "Pathways to impact" document which is attached to this proposal describes the overall industrial strategy for the DiRAC facility, including our strategic goals and key performance indicators.
Organisations
Publications
Ho C
(2024)
Shallower radius valley around low-mass hosts: evidence for icy planets, collisions, or high-energy radiation scatter
in Monthly Notices of the Royal Astronomical Society
Yurchenko S
(2024)
ExoMol line lists - LX. Molecular line list for the ammonia isotopologue 15NH3
in Monthly Notices of the Royal Astronomical Society
Oman K
(2024)
A warm dark matter cosmogony may yield more low-mass galaxy detections in 21-cm surveys than a cold dark matter one
in Monthly Notices of the Royal Astronomical Society
Georgy C
(2024)
3D simulations of a neon burning convective shell in a massive star
in Monthly Notices of the Royal Astronomical Society
Collins C
(2024)
Towards inferring the geometry of kilonovae
in Monthly Notices of the Royal Astronomical Society
Pollin J
(2024)
On the fate of the secondary white dwarf in double-degenerate double-detonation Type Ia supernovae - II. 3D synthetic observables
in Monthly Notices of the Royal Astronomical Society
Turpin G
(2024)
Orbital evolution of close binary systems: comparing viscous and wind-driven circumbinary disc models
in Monthly Notices of the Royal Astronomical Society
Ziampras A
(2024)
Migration of low-mass planets in inviscid discs: the effect of radiation transport on the dynamical corotation torque
in Monthly Notices of the Royal Astronomical Society
Elsender D
(2024)
An implicit algorithm for simulating the dynamics of small dust grains with smoothed particle hydrodynamics
in Monthly Notices of the Royal Astronomical Society
Penzlin A
(2024)
Viscous circumbinary protoplanetary discs - I. Structure of the inner cavity
in Monthly Notices of the Royal Astronomical Society
| Title | Collaboration with Atempo |
| Description | Tape-to-tape data transfer between DiRAC sites. |
| Type Of Technology | Software |
| Year Produced | 2019 |
| Open Source License? | Yes |
| Impact | Proof of concept that data could be read from tape stores remotely via a remote file system |
| Title | Fast Network Links for Durham and Cambridge Universities |
| Description | The Universities of Durham and Cambridge are now linked by a high-performance network |
| Type Of Technology | Physical Model/Kit |
| Year Produced | 2019 |
| Impact | Both HEIs can now ingest data at a higher rate |
| Description | Member of UKRI E-Infrastructure Expert Panel 2017-2019 |
| Form Of Engagement Activity | A formal working group, expert panel or dialogue |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Policymakers/politicians |
| Results and Impact | Created 7 white papers for UKRI which detailed a Roadmap for future e-Infrastructure funding in the UK |
| Year(s) Of Engagement Activity | 2017,2018,2019 |
