DiRAC 2.5 Operations 2017-2020
Lead Research Organisation:
University of Cambridge
Department Name: Institute of Astronomy
Abstract
Physicists across the astronomy, nuclear and particle physics communities are focused on understanding how the Universe works at a very fundamental level. The distance scales with which they work vary by 50 orders of magnitude, from the smallest distances probed by experiments at the Large Hadron Collider, deep within the atomic nucleus, to the largest-scale galaxy clusters discovered out in space. The science challenges, however, are linked through questions such as: how did the Universe begin and how is it evolving? What are the fundamental constituents and fabric of the Universe, and how do they interact?
Progress requires new astronomical observations and experimental data but also new theoretical insights. Theoretical understanding comes increasingly from large-scale computations that allow us to confront the consequences of our theories very accurately with the data or allow us to interrogate the data in detail to extract information that has impact on our theories. These computations test the fastest computers that we have and push the boundaries of technology in this sector. They also provide an excellent environment for training students in state-of-the-art techniques for code optimisation and data mining and visualisation.
The DiRAC-2.5 project builds on the success of the DiRAC HPC facility and will provide the resources needed to support cutting edge research during 2017 in all areas of science supported by STFC.
In addition to the existing DiRAC-2 services, from April 2017 DiRAC-2.5 will provide:
1) A factor 2 increase in the computational power of the DiRAC supercomputer at the University of Durham, which is designed for simulations requiring large amounts of computer memory. The enhanced system will be used to:
(i) simulate the merger of pairs of black holes which generate gravitational waves such as those recently discovered by the LIGO consortium;
(ii) perform the most realistic simulations to date of the formation and evolution of galaxies in the Universe; and
(iii) carry out detailed simulations of the interior of the sun and of planetary interiors.
2) A new High Performance Computer at Cambridge whose particular architecture is well suited to the theoretical problems that we want to tackle that utilise large amounts of data, either as input or being generated at intermediate stages of our calculations. Two key challenges that we will tackle are those of:
(i) improving our understanding of the Milky Way through analysis of new data from the European
Space Agency's Gaia satellite; and
(ii) improving the potential of experiments at CERN's Large Hadron Collider for discovery
of new physics by increasing the accuracy of theoretical predictions for rare processes involving the
fundamental constituents of matter known as quarks.
3) An additional 3500 compute cores on the DiRAC Complexity supercomputer at Leicester which will make it possible to
carry out simulations of some of the most complex physical situations in the Universe. These include:
(i) the formation of stars in clusters - for the first time it will be possible to follow the formation of stars many times more massive than the sun;
(ii) the accretion of gas onto supermassive black holes, the most efficient means of extracting energy from matter and the engine
which drives galaxy formation and evolution.
4) A team of three research software engineers who will help DiRAC researchers to optimise their scientific codes so that they extract
the best possible performance from the hardware components of the DiRAC clusters. These highly skilled programmers will
increase the effective computational power of the DiRAC facility during 2017.
Planned Impact
The expected impact of the DiRAC 2.5 HPC facility is fully described in the attached pathways to impact document and includes:
1) Disseminating best practice in High Performance Computing software engineering throughout the theoretical Particle Physics, Astronomy and Nuclear physics communities in the UK as well as to industry partners.
2) Working on co-design projects with industry partners to improve future generations of hardware and software.
3) Development of new techniques in the area of High Performance Data Analytics which will benefit industry partners and researchers in other fields such as biomedicine, biology, engineering, economics and social science, and the natural environment who can use this new technology to improve research outcomes in their areas.
4) Sharing best practice on the design and operation of distributed HPC facilities with UK National e-Infrastructure partners.
5) Training the next generation of physical scientists to tackle problems effectively on state-of-the-art High Performance Computing facilities. Such skills are much in demand from high-tech industry.
6) Engagement with the general public to promote interest in science, and to explain how our ability to solve complex problems using the latest computer technology leads to new scientific capabilities/insights. Engagement of this kind also naturally encourages the uptake of STEM subjects in schools.
Organisations
Publications
Khachaturyants T
(2021)
How stars formed in warps settle into (and contaminate) thick discs
in Monthly Notices of the Royal Astronomical Society
Khachaturyants T
(2022)
Bending waves excited by irregular gas inflow along warps
in Monthly Notices of the Royal Astronomical Society
Khachaturyants T
(2022)
The pattern speeds of vertical breathing waves
in Monthly Notices of the Royal Astronomical Society: Letters
Khan S
(2021)
Gravitational-wave surrogate models powered by artificial neural networks
in Physical Review D
Kimm T
(2019)
Understanding the escape of LyC and Lya photons from turbulent clouds
in Monthly Notices of the Royal Astronomical Society
Kimm T
(2022)
A Systematic Study of the Escape of LyC and Lya Photons from Star-forming, Magnetized Turbulent Clouds
in The Astrophysical Journal Supplement Series
Kirchschlager F
(2020)
Silicate Grain Growth due to Ion Trapping in Oxygen-rich Supernova Remnants like Cassiopeia A
in The Astrophysical Journal
Kirchschlager F
(2023)
Dust survival rates in clumps passing through the Cas A reverse shock - II. The impact of magnetic fields
in Monthly Notices of the Royal Astronomical Society
Kirchschlager F
(2019)
Dust survival rates in clumps passing through the Cas A reverse shock - I. Results for a range of clump densities
in Monthly Notices of the Royal Astronomical Society
Kobayashi C
(2020)
The Origin of Elements from Carbon to Uranium
in The Astrophysical Journal
Kobayashi C
(2020)
Stellar migrations and metal flows - Chemical evolution of the thin disc of a simulated Milky Way analogous galaxy
in Monthly Notices of the Royal Astronomical Society
Kobayashi C
(2020)
New Type Ia Supernova Yields and the Manganese and Nickel Problems in the Milky Way and Dwarf Spheroidal Galaxies
in The Astrophysical Journal
Koiwai T
(2022)
A first glimpse at the shell structure beyond 54Ca: Spectroscopy of 55K, 55Ca, and 57Ca
in Physics Letters B
Komissarov S
(2019)
Magnetic inhibition of centrifugal instability
in Monthly Notices of the Royal Astronomical Society
Kong S
(2022)
Filament formation via collision-induced magnetic reconnection - formation of a star cluster
in Monthly Notices of the Royal Astronomical Society
Koponen J
(2022)
Properties of low-lying charmonia and bottomonia from lattice QCD + QED
in Suplemento de la Revista Mexicana de Física
Kordov Z
(2020)
Electromagnetic contribution to Σ-Λ mixing using lattice QCD + QED
in Physical Review D
Koudmani S
(2018)
Fast and energetic AGN-driven outflows in simulated dwarf galaxies
Koudmani S
(2019)
Fast and energetic AGN-driven outflows in simulated dwarf galaxies
Koudmani S
(2022)
Two can play at that game: constraining the role of supernova and AGN feedback in dwarf galaxies with cosmological zoom-in simulations
in Monthly Notices of the Royal Astronomical Society
Kozyreva A
(2020)
The influence of line opacity treatment in stella on supernova light curves
in Monthly Notices of the Royal Astronomical Society
Kroupa N
(2024)
Kernel-, mean-, and noise-marginalized Gaussian processes for exoplanet transits and H0 inference
in Monthly Notices of the Royal Astronomical Society
Kruijssen J
(2020)
Kraken reveals itself - the merger history of the Milky Way reconstructed with the E-MOSAICS simulations
in Monthly Notices of the Royal Astronomical Society
Kruijssen J
(2019)
The E-MOSAICS project: tracing galaxy formation and assembly with the age-metallicity distribution of globular clusters
in Monthly Notices of the Royal Astronomical Society
Kugel R
(2023)
FLAMINGO: calibrating large cosmological hydrodynamical simulations with machine learning.
in Monthly Notices of the Royal Astronomical Society
Kukstas E
(2020)
Environment from cross-correlations: connecting hot gas and the quenching of galaxies
in Monthly Notices of the Royal Astronomical Society
Kukstas E
(2023)
GOGREEN: A critical assessment of environmental trends in cosmological hydrodynamical simulations at z ~ 1
in Monthly Notices of the Royal Astronomical Society
Kulkarni G
(2019)
Large Lya opacity fluctuations and low CMB τ in models of late reionization with large islands of neutral hydrogen extending to z < 5.5
in Monthly Notices of the Royal Astronomical Society: Letters
Lach F
(2022)
Models of pulsationally assisted gravitationally confined detonations with different ignition conditions
in Astronomy & Astrophysics
Lach F
(2022)
Type Iax supernovae from deflagrations in Chandrasekhar mass white dwarfs
in Astronomy & Astrophysics
Laitinen T
(2023)
An Analytical Model of Turbulence in Parker Spiral Geometry and Associated Magnetic Field Line Lengths
in The Astrophysical Journal
Laitinen T
(2023)
Solar energetic particle event onsets at different heliolongitudes: The effect of turbulence in Parker spiral geometry
in Astronomy & Astrophysics
Lamberts A
(2022)
Constraining blazar heating with the 2 ≤ z ≤ 3 Lyman-a forest
in Monthly Notices of the Royal Astronomical Society
Lang N
(2022)
Axial-Vector D1 Hadrons in D*π Scattering from QCD
in Physical Review Letters
Lawlor D
(2022)
Thermal Transitions in Dense Two-Colour QCD
in EPJ Web of Conferences
Le Saux A
(2022)
Two-dimensional simulations of solar-like models with artificially enhanced luminosity - II. Impact on internal gravity waves
in Astronomy & Astrophysics
Lee E
(2022)
A multisimulation study of relativistic SZ temperature scalings in galaxy clusters and groups
in Monthly Notices of the Royal Astronomical Society
Lee J
(2020)
Dual Effects of Ram Pressure on Star Formation in Multiphase Disk Galaxies with Strong Stellar Feedback
in The Astrophysical Journal
Lee J
(2022)
Simulating Jellyfish Galaxies: A Case Study for a Gas-rich Dwarf Galaxy
in The Astrophysical Journal