Edinburgh DiRAC Resource Grant

Lead Research Organisation: University of Edinburgh
Department Name: Sch of Physics and Astronomy

Abstract

DiRAC (Distributed Research utilising Advanced Computing) is the integrated supercomputing facility for theoretical modelling and HPC-based research in particle physics, nuclear physics, astronomy and cosmology, areas in which the UK is world-leading. It was established with a £12.32 million investment from the Government's Large Facilities Capital Fund, together with investment from STFC and from universities. In 2012, the DiRAC facility was upgraded with a further £15 million of government capital investment (DiRAC-2).

The DiRAC facility provides a variety of computer architectures, matching machine architecture to the algorithm design and requirements of the research problems to be solved. The science it supports includes: calculating what theories of the early universe predict and testing those predictions against observations of the present universe; lattice field theory calculations whose primary aim is to increase the predictive power of the Standard Model of elementary particle interactions through numerical simulation of Quantum Chromodynamics; and state-of-the-art cosmological simulations, covering the large-scale distribution of dark matter, the formation of dark matter haloes, the formation and evolution of galaxies and clusters, the physics of the intergalactic medium, and the properties of the intracluster gas.

This grant supports the continued operation of the DiRAC facilities until 2017, ensuring that the UK remains a world leader in theoretical modelling in particle physics, astronomy and cosmology.

Planned Impact

The high-performance computing applications supported by DiRAC typically involve new algorithms and implementations optimised for high energy efficiency. These impose demands on computer architectures that the computing industry has found useful for hardware and system-software design and testing.

DiRAC researchers have ongoing collaborations with computing companies that maintain a strong connection between the scientific goals of the DiRAC Consortium and the development of the new computing technologies that drive the commercial high-performance computing market. This brings economic benefits to the companies involved and makes more powerful computing capabilities available to other application areas, including many that address socio-economic challenges.

Publications

Ryan S (2021) Excited and exotic bottomonium spectroscopy from lattice QCD in Journal of High Energy Physics

Yurchenko S (2020) ExoMol line lists - XL. Rovibrational molecular line list for the hydronium ion (H3O+) in Monthly Notices of the Royal Astronomical Society

Owens A (2022) ExoMol line lists - XLVII. Rovibronic molecular line list of the calcium monohydroxide radical (CaOH) in Monthly Notices of the Royal Astronomical Society

Yurchenko S (2020) ExoMol line lists - XXXIX. Ro-vibrational molecular line list for CO2 in Monthly Notices of the Royal Astronomical Society

Yurchenko S (2020) ExoMol line lists - XXXVIII. High-temperature molecular line list of silicon dioxide (SiO2) in Monthly Notices of the Royal Astronomical Society

Yurchenko S (2020) ExoMol molecular line lists - XXXVII. Spectra of acetylene in Monthly Notices of the Royal Astronomical Society

Van Loon M (2021) Explaining the scatter in the galaxy mass-metallicity relation with gas flows in Monthly Notices of the Royal Astronomical Society

Stafford S (2020) Exploring extensions to the standard cosmological model and the impact of baryons on small scales in Monthly Notices of the Royal Astronomical Society

Van Daalen M (2020) Exploring the effects of galaxy formation on matter clustering through a library of simulation power spectra in Monthly Notices of the Royal Astronomical Society

Cummins D (2022) Extreme pebble accretion in ringed protoplanetary discs in Monthly Notices of the Royal Astronomical Society

Miles P (2020) Fallback Rates from Partial Tidal Disruption Events in The Astrophysical Journal

Hernández-Aguayo C (2022) Fast full N-body simulations of generic modified gravity: derivative coupling models in Journal of Cosmology and Astroparticle Physics

Oppenheimer B (2020) Feedback from supermassive black holes transforms centrals into passive galaxies by ejecting circumgalactic gas in Monthly Notices of the Royal Astronomical Society

Kong S (2022) Filament formation via collision-induced magnetic reconnection - formation of a star cluster in Monthly Notices of the Royal Astronomical Society

Cataldi P (2022) Fingerprints of modified gravity on galaxies in voids in Monthly Notices of the Royal Astronomical Society

Hergt L (2022) Finite inflation in curved space in Physical Review D

Aarts G (2016) Finite Temperature Lattice QCD - Baryons in the Quark-Gluon Plasma in Acta Physica Polonica B Proceedings Supplement

Wilkins S (2022) First Light and Reionisation Epoch Simulations (FLARES) - VI. The colour evolution of galaxies z = 5-15 in Monthly Notices of the Royal Astronomical Society

Changeat Q (2022) Five Key Exoplanet Questions Answered via the Analysis of 25 Hot-Jupiter Atmospheres in Eclipse in The Astrophysical Journal Supplement Series

Quinn J (2022) Flute and kink instabilities in a dynamically twisted flux tube with anisotropic plasma viscosity in Monthly Notices of the Royal Astronomical Society

Nealon R (2019) Flyby-induced misalignments in planet-hosting discs in Monthly Notices of the Royal Astronomical Society

Clarke C (2020) Forbidden line diagnostics of photoevaporative disc winds in Monthly Notices of the Royal Astronomical Society

Arnold C (2022) forge: the f(R)-gravity cosmic emulator project - I. Introduction and matter power spectrum emulator in Monthly Notices of the Royal Astronomical Society

 
Description In December 2009, the STFC Facility, DiRAC, was established to provide distributed High Performance Computing (HPC) services for theoretical modelling and HPC-based research in particle physics, astronomy and cosmology. DiRAC provides a variety of computer architectures, matching machine architecture to the algorithm design and requirements of the research problems to be solved. This grant funds the continued operation of the 1.3 Pflop/s Blue Gene/Q system at the University of Edinburgh, which was co-developed by Peter Boyle (University of Edinburgh) and IBM to run with high energy efficiency for months at a time on a single problem, solving some of the most complex problems in physics, particularly the strong interactions of quarks and gluons. The DiRAC Facility supports over 250 active researchers at 27 UK HEIs, including the research projects of 100 funded research staff (PDRAs and Research Fellows), over 50 postgraduate projects, and £1.6M of funded research grants.
Exploitation Route The theoretical results obtained provide input to a range of experimental programmes aiming to increase our understanding of Nature. The algorithms and software developed inform computer design.
Sectors Digital/Communication/Information Technologies (including Software)

URL http://dirac.ac.uk/
 
Description Intel IPAG QCD codesign project 
Organisation Intel Corporation
Department Intel Corporation (Jones Farm)
Country United States 
Sector Private 
PI Contribution We have collaborated with Intel Corporation since 2014, with $720k of total direct funding, starting initially as an Intel Parallel Computing Center and expanding into close direct collaboration with Intel's Pathfinding and Architecture Group.
Collaborator Contribution We have performed detailed optimisation of QCD codes (Wilson, Domain Wall, Staggered) on Intel many-core architectures. We have investigated memory-system and interconnect performance, particularly on Intel's latest interconnect hardware, Omni-Path. We found serious performance issues and worked with Intel to plan a solution; this has been verified and is available as beta software. It will reach general availability in the Intel MPI 2019 release, allowing threaded concurrent communications in MPI for the first time. A joint paper on the resolution was written with the Intel MPI team; in it, the same QCD programming techniques were applied to machine-learning gradient reduction in the Baidu Research all-reduce library, demonstrating a 10x gain for this critical step in machine learning in clustered environments. We are also working with Intel to verify future architectures intended to deliver exascale performance in 2021.
Impact The optimisation and interconnect work described above produced the joint paper with the Intel MPI team, the Omni-Path fixes scheduled for general availability in the Intel MPI 2019 release, and the demonstrated 10x gain in machine-learning gradient reduction. This collaboration has been renewed annually in 2018, 2019 and 2020. Two DiRAC RSEs were hired by Intel to work on the Turing collaboration.
Start Year 2016
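
The "all reduce" gradient-reduction step discussed in the collaboration record above can be illustrated with a ring all-reduce, the communication pattern popularised by the Baidu Research library. The sketch below is a plain-Python simulation of the message pattern only, not DiRAC's or Intel's actual code; all function and variable names are illustrative.

```python
# Illustrative sketch of ring all-reduce. Each of the p ranks starts with
# a full gradient vector, split into p chunks that travel around a ring:
# p-1 "reduce-scatter" steps accumulate sums, then p-1 "all-gather" steps
# broadcast the finished chunks. Each rank communicates ~2(p-1)/p of the
# data regardless of p, which is why the pattern scales well for gradient
# reduction in clustered environments.

def ring_allreduce(grads):
    """grads: one equal-length gradient list per rank.
    Returns what every rank holds after the collective: the element-wise sum."""
    p = len(grads)
    n = len(grads[0])
    assert n % p == 0, "for simplicity, length must divide evenly into p chunks"
    chunk = n // p
    buf = [list(g) for g in grads]           # per-rank working buffers

    def idx(c):                              # indices of chunk c (mod p)
        c %= p
        return range(c * chunk, (c + 1) * chunk)

    # Reduce-scatter: at step t, rank r sends chunk (r - t) to rank r+1,
    # which adds it in. Messages are snapshotted first so this sequential
    # simulation behaves as if all ranks sent simultaneously.
    for t in range(p - 1):
        msgs = [[buf[r][i] for i in idx(r - t)] for r in range(p)]
        for r in range(p):
            dst = (r + 1) % p
            for k, i in enumerate(idx(r - t)):
                buf[dst][i] += msgs[r][k]
    # Rank r now owns the fully reduced chunk (r + 1) % p.

    # All-gather: circulate each finished chunk around the ring,
    # overwriting instead of adding.
    for t in range(p - 1):
        msgs = [[buf[r][i] for i in idx(r + 1 - t)] for r in range(p)]
        for r in range(p):
            dst = (r + 1) % p
            for k, i in enumerate(idx(r + 1 - t)):
                buf[dst][i] = msgs[r][k]
    return buf

# Two ranks, four gradient elements each:
out = ring_allreduce([[1, 2, 3, 4], [10, 20, 30, 40]])
assert all(row == [11, 22, 33, 44] for row in out)
```

In a real MPI implementation the two inner loops become point-to-point sends and receives between ring neighbours; the threaded concurrent MPI communication mentioned above lets several such transfers progress simultaneously per rank.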