DiRAC 2.5 Operations 2017-2020
Lead Research Organisation:
University of Cambridge
Department Name: Institute of Astronomy
Abstract
Physicists across the astronomy, nuclear and particle physics communities are focussed on understanding how the Universe works at a very fundamental level. The distance scales with which they work vary by 50 orders of magnitude from the smallest distances probed by experiments at the Large Hadron Collider, deep within the atomic nucleus, to the largest scale galaxy clusters discovered out in space. The science challenges, however, are linked through questions such as: How did the Universe begin and how is it evolving? and What are the fundamental constituents and fabric of the Universe and how do they interact?
Progress requires new astronomical observations and experimental data but also new theoretical insights. Theoretical understanding comes increasingly from large-scale computations that allow us to confront the consequences of our theories very accurately with the data or allow us to interrogate the data in detail to extract information that has impact on our theories. These computations test the fastest computers that we have and push the boundaries of technology in this sector. They also provide an excellent environment for training students in state-of-the-art techniques for code optimisation and data mining and visualisation.
The DiRAC-2.5 project builds on the success of the DiRAC HPC facility and will provide the resources needed to support cutting edge research during 2017 in all areas of science supported by STFC.
In addition to the existing DiRAC-2 services, from April 2017 DiRAC-2.5 will provide:
1) A factor 2 increase in the computational power of the DiRAC supercomputer at the University of Durham, which is designed for simulations requiring large amounts of computer memory. The enhanced system will be used to:
(i) simulate the merger of pairs of black holes which generate gravitational waves such as those recently discovered by the LIGO consortium;
(ii) perform the most realistic simulations to date of the formation and evolution of galaxies in the Universe;
(iii) carry out detailed simulations of the interior of the sun and of planetary interiors.
2) A new High Performance Computer at Cambridge whose particular architecture is well suited to the theoretical problems that we want to tackle that utilise large amounts of data, either as input or being generated at intermediate stages of our calculations. Two key challenges that we will tackle are those of:
(i) improving our understanding of the Milky Way through analysis of new data from the European Space Agency's GAIA satellite and
(ii) improving the potential of experiments at CERN's Large Hadron Collider for discovery of new physics by increasing the accuracy of theoretical predictions for rare processes involving the fundamental constituents of matter known as quarks.
3) An additional 3500 compute cores on the DiRAC Complexity supercomputer at Leicester, which will make it possible to carry out simulations of some of the most complex physical situations in the Universe. These include:
(i) the formation of stars in clusters - for the first time it will be possible to follow the formation of stars many times more massive than the Sun;
(ii) the accretion of gas onto supermassive black holes, the most efficient means of extracting energy from matter and the engine which drives galaxy formation and evolution.
4) A team of three research software engineers who will help DiRAC researchers optimise their scientific codes to extract the best possible performance from the hardware components of the DiRAC clusters. These highly skilled programmers will increase the effective computational power of the DiRAC facility during 2017.
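The performance work described in point 4 often amounts to replacing slow interpreted loops with vectorised or compiled kernels. As a minimal illustrative sketch only (the function names and the radial-binning task are invented for illustration, not taken from any DiRAC code), the same histogram can be computed with a pure-Python loop or with a vectorised NumPy call that runs orders of magnitude faster on large arrays:

```python
import numpy as np

# Hypothetical example: binning particle radii into a radial profile,
# a common operation in simulation post-processing.

def profile_loop(r, nbins, rmax):
    """Naive pure-Python loop: one bin lookup per particle."""
    counts = [0] * nbins
    for ri in r:
        b = int(ri / rmax * nbins)
        if b < nbins:  # ignore particles beyond rmax
            counts[b] += 1
    return counts

def profile_vectorised(r, nbins, rmax):
    """Same result via NumPy's compiled histogram routine."""
    counts, _ = np.histogram(r, bins=nbins, range=(0.0, rmax))
    return counts.tolist()
```

Both functions return identical bin counts for data inside the range; the vectorised version simply moves the per-particle work out of the Python interpreter, which is the kind of transformation a research software engineer would apply across a production code.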
Planned Impact
The expected impact of the DiRAC 2.5 HPC facility is fully described in the attached pathways to impact document and includes:
1) Disseminating best practice in High Performance Computing software engineering throughout the theoretical Particle Physics, Astronomy and Nuclear physics communities in the UK as well as to industry partners.
2) Working on co-design projects with industry partners to improve future generations of hardware and software.
3) Development of new techniques in the area of High Performance Data Analytics which will benefit industry partners and researchers in other fields such as biomedicine, biology, engineering, economics and social science, and the natural environment who can use this new technology to improve research outcomes in their areas.
4) Sharing best practice on the design and operation of distributed HPC facilities with UK National e-Infrastructure partners.
5) Training the next generation of physical scientists to tackle problems effectively on state-of-the-art High Performance Computing facilities. Such skills are much in demand from high-tech industry.
6) Engagement with the general public to promote interest in science, and to explain how our ability to solve complex problems using the latest computer technology leads to new scientific capabilities/insights. Engagement of this kind also naturally encourages the uptake of STEM subjects in schools.
Organisations
Publications
Pastorek A
(2022)
Time-resolved Fourier transform infrared emission spectroscopy of NH radical in the X³Σ⁻ ground state
in Journal of Quantitative Spectroscopy and Radiative Transfer
Patsourakos S
(2020)
Decoding the Pre-Eruptive Magnetic Field Configurations of Coronal Mass Ejections
in Space Science Reviews
Pawlik M
(2019)
The diverse evolutionary pathways of post-starburst galaxies
in Nature Astronomy
Pedersen C
(2019)
Massive neutrinos and degeneracies in Lyman-alpha forest simulations
Pedersen C
(2020)
Massive neutrinos and degeneracies in Lyman-alpha forest simulations
in Journal of Cosmology and Astroparticle Physics
Pedersen C
(2023)
Compressing the Cosmological Information in One-dimensional Correlations of the Lyman-α Forest
in The Astrophysical Journal
Pedersen C
(2021)
An emulator for the Lyman-α forest in beyond-ΛCDM cosmologies
in Journal of Cosmology and Astroparticle Physics
Pellen M
(2022)
Angular coefficients in W+j production at the LHC with high precision
in The European Physical Journal C
Pereira-Wilson M
(2023)
The cosmic UV background and the beginning and end of star formation in simulated field dwarf galaxies
in Monthly Notices of the Royal Astronomical Society
Pettini M
(2020)
A bound on the ¹²C/¹³C ratio in near-pristine gas with ESPRESSO
in Monthly Notices of the Royal Astronomical Society
Pezzella M
(2021)
A method for calculating temperature-dependent photodissociation cross sections and rates.
in Physical chemistry chemical physics : PCCP
Pfeffer J
(2022)
Using the EAGLE simulations to elucidate the origin of disc surface brightness profile breaks as a function of mass and environment
in Monthly Notices of the Royal Astronomical Society
Pfeffer J
(2019)
The evolution of the UV luminosity function of globular clusters in the E-MOSAICS simulations
in Monthly Notices of the Royal Astronomical Society
Pfeffer J
(2019)
Young star cluster populations in the E-MOSAICS simulations
in Monthly Notices of the Royal Astronomical Society
Pfeffer J
(2023)
Globular cluster metallicity distributions in the E-MOSAICS simulations
in Monthly Notices of the Royal Astronomical Society
Pfeffer J
(2020)
Predicting accreted satellite galaxy masses and accretion redshifts based on globular cluster orbits in the E-MOSAICS simulations
in Monthly Notices of the Royal Astronomical Society
Pfeifer S
(2020)
The bahamas project: effects of a running scalar spectral index on large-scale structure
in Monthly Notices of the Royal Astronomical Society
Pfeifer S
(2020)
The BAHAMAS project: effects of dynamical dark energy on large-scale structure
in Monthly Notices of the Royal Astronomical Society
Phillips M
(2020)
A new set of atmosphere and evolution models for cool T-Y brown dwarfs and giant exoplanets
in Astronomy & Astrophysics
Pichon C
(2020)
And yet it flips: connecting galactic spin and the cosmic web
in Monthly Notices of the Royal Astronomical Society
Pichon C
(2020)
Why do extremely massive disc galaxies exist today?
in Monthly Notices of the Royal Astronomical Society
Pierens A
(2023)
Three-dimensional evolution of radiative circumbinary discs: The size and shape of the inner cavity
in Astronomy & Astrophysics
Pimpanuwat B
(2020)
Maser flares driven by variations in pumping and background radiation
in Monthly Notices of the Royal Astronomical Society
Pinte C
(2020)
Rocking shadows in broken circumbinary discs
in Monthly Notices of the Royal Astronomical Society: Letters
Pittard J
(2019)
Momentum and energy injection by a supernova remnant into an inhomogeneous medium
in Monthly Notices of the Royal Astronomical Society
Ploeckinger S
(2024)
Resolution criteria to avoid artificial clumping in Lagrangian hydrodynamic simulations with a multiphase interstellar medium
in Monthly Notices of the Royal Astronomical Society
Ploeckinger S
(2020)
Radiative cooling rates, ion fractions, molecule abundances, and line emissivities including self-shielding and both local and metagalactic radiation fields
in Monthly Notices of the Royal Astronomical Society
Pontzen A
(2021)
EDGE: a new approach to suppressing numerical diffusion in adaptive mesh simulations of galaxy formation
in Monthly Notices of the Royal Astronomical Society
Poole-McKenzie R
(2020)
Informing dark matter direct detection limits with the ARTEMIS simulations
in Journal of Cosmology and Astroparticle Physics
Porth L
(2020)
Fast estimation of aperture mass statistics - I. Aperture mass variance and an application to the CFHTLenS data
in Monthly Notices of the Royal Astronomical Society
Potter M
(2019)
Forced magnetic reconnection and plasmoid coalescence I. Magnetohydrodynamic simulations
in Astronomy & Astrophysics
Power C
(2019)
nIFTy galaxy cluster simulations VI: the dynamical imprint of substructure on gaseous cluster outskirts.
in Monthly Notices of the Royal Astronomical Society
Pratt J
(2020)
Comparison of 2D and 3D compressible convection in a pre-main sequence star
in Astronomy & Astrophysics
Prgomet M
(2022)
EDGE: The sensitivity of ultra-faint dwarfs to a metallicity-dependent initial mass function
in Monthly Notices of the Royal Astronomical Society
Prole L
(2022)
Fragmentation-induced starvation in Population III star formation: a resolution study
in Monthly Notices of the Royal Astronomical Society
Prole L
(2022)
Primordial magnetic fields in Population III star formation: a magnetized resolution study
in Monthly Notices of the Royal Astronomical Society
Prole L
(2023)
From dark matter halos to pre-stellar cores: high resolution follow-up of cosmological Lyman-Werner simulations
in Monthly Notices of the Royal Astronomical Society
Puchwein E
(2023)
The Sherwood-Relics simulations: overview and impact of patchy reionization and pressure smoothing on the intergalactic medium
in Monthly Notices of the Royal Astronomical Society
Qiao L
(2022)
The evolution of protoplanetary discs in star formation and feedback simulations
in Monthly Notices of the Royal Astronomical Society
Quinn J
(2022)
Flute and kink instabilities in a dynamically twisted flux tube with anisotropic plasma viscosity
in Monthly Notices of the Royal Astronomical Society
Radia M
(2021)
Anomalies in the gravitational recoil of eccentric black-hole mergers with unequal mass ratios
in Physical Review D
Radia M
(2021)
Lessons for adaptive mesh refinement in numerical relativity
Radia M
(2022)
Lessons for adaptive mesh refinement in numerical relativity