Edinburgh DiRAC Resource Grant
Lead Research Organisation:
University of Edinburgh
Department Name: Sch of Physics and Astronomy
Abstract
DiRAC (Distributed Research utilising Advanced Computing) is the integrated supercomputing facility for theoretical modelling and HPC-based research in particle physics, nuclear physics, astronomy and cosmology, areas in which the UK is world-leading. It was funded by a £12.32 million investment from the Government's Large Facilities Capital Fund, together with investment from STFC and from universities. In 2012, the DiRAC facility was upgraded with a further £15 million capital investment from government (DiRAC-2).
The DiRAC facility provides a variety of computer architectures, matching machine architecture to the algorithm design and requirements of the research problems to be solved. The science facilitated includes: using supercomputers to enable scientists to calculate what theories of the early universe predict and to test them against observations of the present universe; undertaking lattice field theory calculations whose primary aim is to increase the predictive power of the Standard Model of elementary particle interactions through numerical simulation of Quantum Chromodynamics; carrying out state-of-the-art cosmological simulations, including the large-scale distribution of dark matter, the formation of dark matter haloes, the formation and evolution of galaxies and clusters, the physics of the intergalactic medium and the properties of the intracluster gas.
This grant is to support the continued operation of the DiRAC facilities until 2017 to ensure that the UK remains one of the world-leaders of theoretical modelling in particle physics, astronomy and cosmology.
Planned Impact
The high-performance computing applications supported by DiRAC typically involve new algorithms and implementations optimised for high energy efficiency. These impose demands on computer architectures that the computing industry has found useful for hardware and system-software design and testing.
DiRAC researchers have on-going collaborations with computing companies that maintain this strong connection between the scientific goals of the DiRAC Consortium and the development of new computing technologies that drive the commercial high-performance computing market, with economic benefits to the companies involved and more powerful computing capabilities available to other application areas including many that address socio-economic challenges.
People
Richard Kenway (Principal Investigator)
Peter Boyle (Co-Investigator)
Publications
Mitchell P
(2022)
How gas flows shape the stellar-halo mass relation in the eagle simulation
in Monthly Notices of the Royal Astronomical Society
Ho S
(2021)
How Identifying Circumgalactic Gas by Line-of-sight Velocity instead of the Location in 3D Space Affects O VI Measurements
in The Astrophysical Journal
Threlfall J
(2020)
How Is Helicity (and Twist) Partitioned in Magnetohydrodynamic Simulations of Reconnecting Magnetic Flux Tubes?
in The Astrophysical Journal
Gómez-Guijarro C
(2020)
How primordial magnetic fields shrink galaxies
in Monthly Notices of the Royal Astronomical Society
Khachaturyants T
(2021)
How stars formed in warps settle into (and contaminate) thick discs
in Monthly Notices of the Royal Astronomical Society
Garzilli A
(2021)
How to constrain warm dark matter with the Lyman-α forest
in Monthly Notices of the Royal Astronomical Society
Slyz A
(2020)
How to quench a dwarf galaxy: The impact of inhomogeneous reionization on dwarf galaxies and cosmic filaments
in Monthly Notices of the Royal Astronomical Society
Evans T
(2020)
How unusual is the Milky Way's assembly history?
in Monthly Notices of the Royal Astronomical Society
Lower S
(2020)
How Well Can We Measure the Stellar Mass of a Galaxy: The Impact of the Assumed Star Formation History Model in SED Fitting
in The Astrophysical Journal
Hou J
(2021)
How well is angular momentum accretion modelled in semi-analytic galaxy formation models?
in Monthly Notices of the Royal Astronomical Society
Edwards B
(2020)
Hubble WFC3 Spectroscopy of the Habitable-zone Super-Earth LHS 1140 b
in The Astronomical Journal
Moews B
(2021)
Hybrid analytic and machine-learned baryonic property insertion into galactic dark matter haloes
in Monthly Notices of the Royal Astronomical Society
Ziampras A
(2023)
Hydrodynamic turbulence in disks with embedded planets
in Astronomy & Astrophysics
Pagano P
(2020)
Hydrogen non-equilibrium ionisation effects in coronal mass ejections
in Astronomy & Astrophysics
Bosman S
(2022)
Hydrogen reionization ends by z = 5.3: Lyman-α optical depth measured by the XQR-30 sample
in Monthly Notices of the Royal Astronomical Society
Pearce F
(2020)
Hydrostatic mass estimates of massive galaxy clusters: a study with varying hydrodynamics flavours and non-thermal pressure support
in Monthly Notices of the Royal Astronomical Society
Kegerreis J
(2022)
Immediate Origin of the Moon as a Post-impact Satellite
in The Astrophysical Journal Letters
Hutchinson A
(2023)
Impact of corotation on gradual solar energetic particle event intensity profiles
in Astronomy & Astrophysics
Vlaykov D
(2022)
Impact of radial truncation on global 2D hydrodynamic simulations for a Sun-like model
in Monthly Notices of the Royal Astronomical Society
Han D
(2022)
Impact of Radiation Feedback on the Formation of Globular Cluster Candidates during Cloud-Cloud Collisions
in The Astrophysical Journal
Eager-Nash J
(2020)
Implications of different stellar spectra for the climate of tidally locked Earth-like exoplanets
in Astronomy & Astrophysics
Raste J
(2021)
Implications of the z > 5 Lyman-α forest for the 21-cm power spectrum from the epoch of reionization
in Monthly Notices of the Royal Astronomical Society
Drummond B
(2020)
Implications of three-dimensional chemical transport in hot Jupiter atmospheres: Results from a consistently coupled chemistry-radiation-hydrodynamics model
in Astronomy & Astrophysics
Cho Y
(2015)
Improved lattice fermion action for heavy quarks
in Journal of High Energy Physics
Chakraborty B
(2021)
Improved V_cs determination using precise lattice QCD form factors for D → K ℓ ν
in Physical Review D
Donevski D
(2020)
In pursuit of giants I. The evolution of the dust-to-stellar mass ratio in distant dusty galaxies
in Astronomy & Astrophysics
Borrow J
(2021)
Inconsistencies arising from the coupling of galaxy formation sub-grid models to pressure-smoothed particle hydrodynamics
in Monthly Notices of the Royal Astronomical Society
Falck B
(2021)
Indra: a public computationally accessible suite of cosmological N -body simulations
in Monthly Notices of the Royal Astronomical Society
Poole-McKenzie R
(2020)
Informing dark matter direct detection limits with the ARTEMIS simulations
in Journal of Cosmology and Astroparticle Physics
Vandenbroucke B
(2020)
Infrared luminosity functions and dust mass functions in the EAGLE simulation
in Monthly Notices of the Royal Astronomical Society
Kupilas M
(2021)
Interactions of a shock with a molecular cloud at various stages of its evolution due to thermal instability and gravity
in Monthly Notices of the Royal Astronomical Society
Buie E
(2020)
Interpreting Observations of Absorption Lines in the Circumgalactic Medium with a Turbulent Medium
in The Astrophysical Journal
Hill A
(2022)
Intrinsic alignments of the extended radio continuum emission of galaxies in the EAGLE simulations
in Monthly Notices of the Royal Astronomical Society
Vizgan D
(2022)
Investigating the [C II]-to-H I Conversion Factor and the H I Gas Budget of Galaxies at z ≈ 6 with Hydrodynamic Simulations
in The Astrophysical Journal Letters
Kalaghatgi C
(2021)
Investigating the effect of in-plane spin directions for precessing binary black hole systems
in Physical Review D
Linh B
(2021)
Investigation of the ground-state spin inversion in the neutron-rich 47,49Cl isotopes
in Physical Review C
Scardoni C
(2022)
Inward and outward migration of massive planets: moving towards a stalling radius
in Monthly Notices of the Royal Astronomical Society
Hellinger P
(2022)
Ion-scale Transition of Plasma Turbulence: Pressure-Strain Effect
in The Astrophysical Journal
Dickey C
(2021)
IQ Collaboratory. II. The Quiescent Fraction of Isolated, Low-mass Galaxies across Simulations and Observations
in The Astrophysical Journal
Whitworth D
(2022)
Is the molecular KS relationship universal down to low metallicities?
in Monthly Notices of the Royal Astronomical Society
Boyle P
(2017)
Isospin breaking corrections to meson masses and the hadronic vacuum polarization: a comparative study
in Journal of High Energy Physics
Boyle P
(2023)
Isospin-breaking corrections to light-meson leptonic decays from lattice simulations at physical quark masses
in Journal of High Energy Physics
Wang Y
(2020)
Iterative removal of redshift-space distortions from galaxy clustering
in Monthly Notices of the Royal Astronomical Society
Christiansen J
(2020)
Jet feedback and the photon underproduction crisis in simba
in Monthly Notices of the Royal Astronomical Society
Blum T
(2015)
K → ππ ΔI = 3/2 decay amplitude in the continuum limit
in Physical Review D
Janowski T
(2015)
K-pi scattering lengths at physical kinematics
Changeat Q
(2020)
KELT-11 b: Abundances of Water and Constraints on Carbon-bearing Molecules from the Hubble Transmission Spectrum
in The Astronomical Journal
Joudaki S
(2020)
KiDS+VIKING-450 and DES-Y1 combined: Cosmology with cosmic shear
in Astronomy & Astrophysics
Hildebrandt H
(2020)
KiDS+VIKING-450: Cosmic shear tomography with optical and infrared data
in Astronomy & Astrophysics
Description | In December 2009, the STFC Facility, DiRAC, was established to provide distributed High Performance Computing (HPC) services for theoretical modelling and HPC-based research in particle physics, astronomy and cosmology. DiRAC provides a variety of computer architectures, matching machine architecture to the algorithm design and requirements of the research problems to be solved. This grant funds the continued operation of the 1.3Pflop/s Blue Gene/Q system at the University of Edinburgh, which was co-developed by Peter Boyle (University of Edinburgh) and IBM to run with high energy efficiency for months at a time on a single problem to solve some of the most complex problems in physics, particularly the strong interactions of quarks and gluons. The DiRAC Facility supports over 250 active researchers at 27 UK HEIs. This includes the research projects of 100 funded research staff (PDRAs and Research Fellows), over 50 post-graduate projects, and £1.6M of funded research grants. |
Exploitation Route | The theoretical results obtained provide input to a range of experimental programmes aiming to increase our understanding of Nature. The algorithms and software developed provide input to computer design. |
Sectors | Digital/Communication/Information Technologies (including Software) |
URL | http://dirac.ac.uk/ |
Description | Intel IPAG QCD codesign project |
Organisation | Intel Corporation |
Department | Intel Corporation (Jones Farm) |
Country | United States |
Sector | Private |
PI Contribution | We have collaborated with Intel Corporation since 2014, with $720k of total direct funding, starting initially as an Intel Parallel Computing Centre and expanding to direct close collaboration with the Intel Pathfinding and Architecture Group. |
Collaborator Contribution | We have performed detailed optimisation of QCD codes (Wilson, Domain Wall, Staggered) on Intel many-core architectures. We have investigated memory-system and interconnect performance, particularly on Intel's latest interconnect hardware, Omni-Path. We found serious performance issues and worked with Intel to plan a solution; this has been verified and is available as beta software. It will reach general availability in the Intel MPI 2019 release and allow threaded concurrent communications in MPI for the first time. A joint paper on the resolution was written with the Intel MPI team; in the same paper, the QCD programming techniques were applied to machine-learning gradient reduction in the Baidu Research all-reduce library, demonstrating a 10x gain for this critical step in machine learning in clustered environments. We are also working with Intel to verify future architectures intended to deliver exascale performance in 2021. |
Impact | The interconnect fix developed with Intel will reach general availability in the Intel MPI 2019 release, allowing threaded concurrent communications in MPI for the first time, and the application of the same QCD communication techniques to machine-learning gradient reduction demonstrated a 10x gain for the Baidu Research all-reduce library. This collaboration has been renewed annually in 2018, 2019 and 2020. Two DiRAC RSEs were hired by Intel to work on the Turing collaboration. |
Start Year | 2016 |
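The gradient-reduction step mentioned in the Intel partnership above is typically implemented as a ring all-reduce, the communication pattern popularised by the Baidu Research all-reduce library. The following is a minimal single-process Python sketch of that general algorithm for illustration only; it simulates the ranks in one process and is not DiRAC's or Intel's actual MPI code, and the function name `ring_allreduce` is hypothetical.

```python
def ring_allreduce(vectors):
    """Sum equal-length vectors held by n simulated ranks via ring all-reduce.

    Each rank's vector is split into n chunks. Phase 1 (reduce-scatter):
    over n-1 steps, each rank passes one chunk to its right-hand neighbour,
    which accumulates it, so that rank r ends holding the fully summed
    chunk (r+1) mod n. Phase 2 (all-gather): over n-1 further steps the
    completed chunks circulate until every rank holds the complete sum.
    Total data moved per rank is ~2x the vector size, independent of n,
    which is why the pattern scales well for gradient reduction.
    """
    n = len(vectors)
    data = [list(v) for v in vectors]      # per-rank working buffers
    length = len(data[0])
    assert length % n == 0, "vector length must divide evenly into n chunks"
    c = length // n                        # chunk size

    # Phase 1: reduce-scatter.
    for s in range(n - 1):
        for r in range(n):                 # each rank sends chunk (r - s) mod n
            k = (r - s) % n
            dst = (r + 1) % n
            for i in range(k * c, (k + 1) * c):
                data[dst][i] += data[r][i]

    # Phase 2: all-gather of the completed chunks.
    for s in range(n - 1):
        for r in range(n):                 # each rank forwards chunk (r + 1 - s) mod n
            k = (r + 1 - s) % n
            dst = (r + 1) % n
            for i in range(k * c, (k + 1) * c):
                data[dst][i] = data[r][i]

    return data                            # every rank now holds the element-wise sum
```

In a real MPI implementation each inner loop body is a simultaneous send/receive between neighbouring ranks; the threaded concurrent communications discussed above let those exchanges overlap across threads.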