DiRAC 2.5 Operations 2017-2020
Lead Research Organisation:
University of Cambridge
Department Name: Institute of Astronomy
Abstract
Physicists across the astronomy, nuclear and particle physics communities are focussed on understanding how the Universe works at a very fundamental level. The distance scales with which they work vary by 50 orders of magnitude from the smallest distances probed by experiments at the Large Hadron Collider, deep within the atomic nucleus, to the largest scale galaxy clusters discovered out in space. The science challenges, however, are linked through questions such as: How did the Universe begin and how is it evolving? and What are the fundamental constituents and fabric of the Universe and how do they interact?
Progress requires new astronomical observations and experimental data but also new theoretical insights. Theoretical understanding comes increasingly from large-scale computations that allow us to confront the consequences of our theories very accurately with the data or allow us to interrogate the data in detail to extract information that has impact on our theories. These computations test the fastest computers that we have and push the boundaries of technology in this sector. They also provide an excellent environment for training students in state-of-the-art techniques for code optimisation and data mining and visualisation.
The DiRAC-2.5 project builds on the success of the DiRAC HPC facility and will provide the resources needed to support cutting edge research during 2017 in all areas of science supported by STFC.
In addition to the existing DiRAC-2 services, from April 2017 DiRAC-2.5 will provide:
1) A factor 2 increase in the computational power of the DiRAC supercomputer at the University of Durham, which is designed for simulations requiring large amounts of computer memory. The enhanced system will be used to:
(i) simulate the merger of pairs of black holes which generate gravitational waves such as those recently discovered by the LIGO consortium;
(ii) perform the most realistic simulations to date of the formation and evolution of galaxies in the Universe
(iii) carry out detailed simulations of the interior of the sun and of planetary interiors.
2) A new High Performance Computer at Cambridge whose architecture is well suited to theoretical problems that utilise large amounts of data, either as input or generated at intermediate stages of our calculations. Two key challenges that we will tackle are:
(i) improving our understanding of the Milky Way through analysis of new data from the European
Space Agency's Gaia satellite and
(ii) improving the potential of experiments at CERN's Large Hadron Collider for discovery
of new physics by increasing the accuracy of theoretical predictions for rare processes involving the
fundamental constituents of matter known as quarks.
3) An additional 3500 compute cores on the DiRAC Complexity supercomputer at Leicester which will make it possible to
carry out simulations of some of the most complex physical situations in the Universe. These include:
(i) the formation of stars in clusters - for the first time it will be possible to follow the formation of stars many times more massive than the sun;
(ii) the accretion of gas onto supermassive black holes, the most efficient means of extracting energy from matter and the engine
which drives galaxy formation and evolution.
4) A team of three research software engineers who will help DiRAC researchers optimise their scientific codes to extract
the best possible performance from the hardware components of the DiRAC clusters. These highly skilled programmers will
increase the effective computational power of the DiRAC facility during 2017.
Planned Impact
The expected impact of the DiRAC 2.5 HPC facility is fully described in the attached pathways to impact document and includes:
1) Disseminating best practice in High Performance Computing software engineering throughout the theoretical Particle Physics, Astronomy and Nuclear physics communities in the UK as well as to industry partners.
2) Working on co-design projects with industry partners to improve future generations of hardware and software.
3) Developing new techniques in the area of High Performance Data Analytics which will benefit industry partners as well as researchers in other fields, such as biomedicine, biology, engineering, economics, social science and the natural environment, who can use this new technology to improve research outcomes in their areas.
4) Sharing best practice on the design and operation of distributed HPC facilities with UK National e-Infrastructure partners.
5) Training the next generation of physical scientists to tackle problems effectively on state-of-the-art High Performance Computing facilities. Such skills are much in demand from high-tech industry.
6) Engagement with the general public to promote interest in science, and to explain how our ability to solve complex problems using the latest computer technology leads to new scientific capabilities/insights. Engagement of this kind also naturally encourages the uptake of STEM subjects in schools.
Organisations
Publications
Sykes C
(2019)
Fluorescent rings in star-free dark matter haloes
in Monthly Notices of the Royal Astronomical Society
Quinn J
(2022)
Flute and kink instabilities in a dynamically twisted flux tube with anisotropic plasma viscosity
in Monthly Notices of the Royal Astronomical Society
Nealon R
(2019)
Flyby-induced misalignments in planet-hosting discs
in Monthly Notices of the Royal Astronomical Society
Hughes D
(2019)
Force balance in convectively driven dynamos with no inertia
in Journal of Fluid Mechanics
Potter M
(2019)
Forced magnetic reconnection and plasmoid coalescence I. Magnetohydrodynamic simulations
in Astronomy & Astrophysics
Arnold C
(2022)
forge: the f(R)-gravity cosmic emulator project - I. Introduction and matter power spectrum emulator
in Monthly Notices of the Royal Astronomical Society
Cooper L.
(2022)
Form factors for the decay processes B_c^+ → D^0 l^+ ν_l and B_c^+ → D_s^+ l^+ l^- from lattice QCD
in Proceedings of Science
Cooper L
(2022)
Form factors for the processes B_c^+ → D^0 l^+ ν_l and B_c^+ → D_s^+ l^+ l^- (ν ν̄) from lattice QCD
in Physical Review D
Reina-Campos M
(2019)
Formation histories of stars, clusters, and globular clusters in the E-MOSAICS simulations
in Monthly Notices of the Royal Astronomical Society
Fyfe L
(2021)
Forward modelling of heating within a coronal arcade
in Astronomy & Astrophysics
Prole L
(2022)
Fragmentation-induced starvation in Population III star formation: a resolution study
in Monthly Notices of the Royal Astronomical Society
Al-Refaie A
(2022)
FRECKLL: Full and Reduced Exoplanet Chemical Kinetics distiLLed
Prole L
(2023)
From dark matter halos to pre-stellar cores: high resolution follow-up of cosmological Lyman-Werner simulations
in Monthly Notices of the Royal Astronomical Society
Garratt-Smithson L
(2019)
Galactic chimney sweeping: the effect of 'gradual' stellar feedback mechanisms on the evolution of dwarf galaxies
in Monthly Notices of the Royal Astronomical Society
Mitchell P
(2020)
Galactic inflow and wind recycling rates in the eagle simulations
in Monthly Notices of the Royal Astronomical Society
Fiacconi D
(2018)
Galactic nuclei evolution with spinning black holes: method and implementation
in Monthly Notices of the Royal Astronomical Society
Mitchell P
(2020)
Galactic outflow rates in the EAGLE simulations
in Monthly Notices of the Royal Astronomical Society
Forouhar Moreno V
(2022)
Galactic satellite systems in CDM, WDM and SIDM
in Monthly Notices of the Royal Astronomical Society
Haynes C
(2019)
Galactic simulations of r-process elemental abundances
in Monthly Notices of the Royal Astronomical Society
Borukhovetskaya A
(2022)
Galactic tides and the Crater II dwarf spheroidal: a challenge to LCDM?
in Monthly Notices of the Royal Astronomical Society
Altamura E
(2023)
Galaxy cluster rotation revealed in the MACSIS simulations with the kinetic Sunyaev-Zeldovich effect
in Monthly Notices of the Royal Astronomical Society
Cuesta-Lazaro C
(2023)
Galaxy clustering from the bottom up: a streaming model emulator I
in Monthly Notices of the Royal Astronomical Society
Davé R
(2020)
Galaxy cold gas contents in modern cosmological hydrodynamic simulations
in Monthly Notices of the Royal Astronomical Society
Baugh C
(2019)
Galaxy formation in the Planck Millennium: the atomic hydrogen content of dark matter haloes
in Monthly Notices of the Royal Astronomical Society
McAlpine S
(2020)
Galaxy mergers in eagle do not induce a significant amount of black hole growth yet do increase the rate of luminous AGN
in Monthly Notices of the Royal Astronomical Society
Xu W
(2020)
Galaxy properties in the cosmic web of EAGLE simulation
in Monthly Notices of the Royal Astronomical Society
Nightingale J
(2019)
Galaxy structure with strong gravitational lensing: decomposing the internal mass distribution of massive elliptical galaxies
in Monthly Notices of the Royal Astronomical Society
He Q
(2022)
Galaxy-galaxy strong lens perturbations: line-of-sight haloes versus lens subhaloes
in Monthly Notices of the Royal Astronomical Society
Nelson R
(2023)
Gas accretion onto Jupiter mass planets in discs with laminar accretion flows
in Astronomy & Astrophysics
Towler I
(2023)
Gas clumping and its effect on hydrostatic bias in the MACSIS simulations
in Monthly Notices of the Royal Astronomical Society
Shiraishi M
(2019)
General modal estimation for cross-bispectra
in Journal of Cosmology and Astroparticle Physics
Hannaford-Gunn A
(2022)
Generalized parton distributions from the off-forward Compton amplitude in lattice QCD
in Physical Review D
Currie L
(2020)
Generation of shear flows and vortices in rotating anelastic convection
in Physical Review Fluids
Beraldo e Silva L
(2020)
Geometric properties of galactic discs with clumpy episodes
in Monthly Notices of the Royal Astronomical Society
Clough K
(2022)
Ghost Instabilities in Self-Interacting Vector Fields: The Problem with Proca Fields
in Physical Review Letters
Wang C
(2023)
Ghostly Galaxies: Accretion-dominated Stellar Systems in Low-mass Dark Matter Halos
in The Astrophysical Journal
Huang J
(2023)
Global 3D Radiation Magnetohydrodynamic Simulations of Accretion onto a Stellar-mass Black Hole at Sub- and Near-critical Accretion Rates
in The Astrophysical Journal
Chang C
(2023)
Global fits of simplified models for dark matter with GAMBIT
Chang C
(2023)
Global fits of simplified models for dark matter with GAMBIT I. Scalar and fermionic models with s-channel vector mediators
in The European Physical Journal C
Chang C
(2023)
Global fits of simplified models for dark matter with GAMBIT II. Vector dark matter with an s-channel vector mediator
in The European Physical Journal C
Pfeffer J
(2023)
Globular cluster metallicity distributions in the E-MOSAICS simulations
in Monthly Notices of the Royal Astronomical Society
Bennett E
(2021)
Glueballs and strings in Sp(2N) Yang-Mills theories
in Physical Review D
Kukstas E
(2023)
GOGREEN: A critical assessment of environmental trends in cosmological hydrodynamical simulations at z ~ 1
in Monthly Notices of the Royal Astronomical Society