Edinburgh DiRAC Resource Grant
Lead Research Organisation:
University of Edinburgh
Department Name: Sch of Physics and Astronomy
Abstract
DiRAC (Distributed Research utilising Advanced Computing) is the integrated supercomputing facility for theoretical modelling and HPC-based research in particle physics, nuclear physics, astronomy and cosmology, areas in which the UK is world-leading. It was established with £12.32 million of capital investment from the Government's Large Facilities Capital Fund, together with investment from STFC and from universities. In 2012, the DiRAC facility was upgraded with a further £15 million capital investment from government (DiRAC-2).
The DiRAC facility provides a variety of computer architectures, matching machine architecture to the algorithm design and requirements of the research problems to be solved. The science facilitated includes: using supercomputers to enable scientists to calculate what theories of the early universe predict and to test them against observations of the present universe; undertaking lattice field theory calculations whose primary aim is to increase the predictive power of the Standard Model of elementary particle interactions through numerical simulation of Quantum Chromodynamics; carrying out state-of-the-art cosmological simulations, including the large-scale distribution of dark matter, the formation of dark matter haloes, the formation and evolution of galaxies and clusters, the physics of the intergalactic medium and the properties of the intracluster gas.
This grant is to support the continued operation of the DiRAC facilities until 2017 to ensure that the UK remains one of the world-leaders of theoretical modelling in particle physics, astronomy and cosmology.
Planned Impact
The high-performance computing applications supported by DiRAC typically involve new algorithms and implementations optimised for high energy efficiency. These impose demands on computer architectures that the computing industry has found useful for hardware and system-software design and testing.
DiRAC researchers have on-going collaborations with computing companies that maintain this strong connection between the scientific goals of the DiRAC Consortium and the development of new computing technologies that drive the commercial high-performance computing market, with economic benefits to the companies involved and more powerful computing capabilities available to other application areas including many that address socio-economic challenges.
People
Richard Kenway (Principal Investigator)
Peter Boyle (Co-Investigator)
Publications
Rieder S
(2022)
The formation and early evolution of embedded star clusters in spiral galaxies
in Monthly Notices of the Royal Astronomical Society
Owen J
(2020)
Massive discs around low-mass stars
in Monthly Notices of the Royal Astronomical Society
Khachaturyants T
(2022)
Bending waves excited by irregular gas inflow along warps
in Monthly Notices of the Royal Astronomical Society
Borrow J
(2020)
Cosmological baryon transfer in the simba simulations
in Monthly Notices of the Royal Astronomical Society
Izquierdo A
(2021)
The Cloud Factory II: gravoturbulent kinematics of resolved molecular clouds in a galactic potential
in Monthly Notices of the Royal Astronomical Society
Vandenbroucke B
(2020)
Infrared luminosity functions and dust mass functions in the EAGLE simulation
in Monthly Notices of the Royal Astronomical Society
Elbers W
(2021)
An optimal non-linear method for simulating relic neutrinos
in Monthly Notices of the Royal Astronomical Society
Vijayan A
(2020)
First Light And Reionisation Epoch Simulations (FLARES) II: The Photometric Properties of High-Redshift Galaxies
in Monthly Notices of the Royal Astronomical Society
Hughes M
(2021)
What to expect when using globular clusters as tracers of the total mass distribution in Milky Way-mass galaxies
in Monthly Notices of the Royal Astronomical Society
Huško F
(2022)
Spin-driven jet feedback in idealized simulations of galaxy groups and clusters
in Monthly Notices of the Royal Astronomical Society
Mitchell P
(2022)
Baryonic mass budgets for haloes in the eagle simulation, including ejected and prevented gas
in Monthly Notices of the Royal Astronomical Society
Jackson T
(2020)
The star formation properties of the observed and simulated AGN Universe: BAT versus EAGLE
in Monthly Notices of the Royal Astronomical Society
Richardson M
(2020)
Simulating gas kinematic studies of high-redshift galaxies with the HARMONI integral field spectrograph
in Monthly Notices of the Royal Astronomical Society
Santos-Santos I
(2020)
Baryonic clues to the puzzling diversity of dwarf galaxy rotation curves
in Monthly Notices of the Royal Astronomical Society
Henden N
(2020)
The baryon content of groups and clusters of galaxies in the FABLE simulations
in Monthly Notices of the Royal Astronomical Society
Zarrouk P
(2022)
Preliminary clustering properties of the DESI BGS bright targets using DR9 Legacy Imaging Surveys
in Monthly Notices of the Royal Astronomical Society
Dutta R
(2021)
Metal-enriched halo gas across galaxy overdensities over the last 10 billion years
in Monthly Notices of the Royal Astronomical Society
Lee E
(2022)
A multisimulation study of relativistic SZ temperature scalings in galaxy clusters and groups
in Monthly Notices of the Royal Astronomical Society
Dobbs C
(2022)
The formation of massive stellar clusters in converging galactic flows with photoionization
in Monthly Notices of the Royal Astronomical Society
Keating L
(2020)
Long troughs in the Lyman-α forest below redshift 6 due to islands of neutral hydrogen
in Monthly Notices of the Royal Astronomical Society
Evans T
(2022)
Observing EAGLE galaxies with JWST: predictions for Milky Way progenitors and their building blocks
in Monthly Notices of the Royal Astronomical Society
Zheng Y
(2022)
Rapidly quenched galaxies in the Simba cosmological simulation and observations
in Monthly Notices of the Royal Astronomical Society
Kelly A
(2022)
Apostle-Auriga: effects of different subgrid models on the baryon cycle around Milky Way-mass galaxies
in Monthly Notices of the Royal Astronomical Society
Irodotou D
(2021)
Using angular momentum maps to detect kinematically distinct galactic components
in Monthly Notices of the Royal Astronomical Society
Kupilas M
(2021)
Interactions of a shock with a molecular cloud at various stages of its evolution due to thermal instability and gravity
in Monthly Notices of the Royal Astronomical Society
Description | In December 2009, the STFC Facility, DiRAC, was established to provide distributed High Performance Computing (HPC) services for theoretical modelling and HPC-based research in particle physics, astronomy and cosmology. DiRAC provides a variety of computer architectures, matching machine architecture to the algorithm design and requirements of the research problems to be solved. This grant funds the continued operation of the 1.3Pflop/s Blue Gene/Q system at the University of Edinburgh, which was co-developed by Peter Boyle (University of Edinburgh) and IBM to run with high energy efficiency for months at a time on a single problem to solve some of the most complex problems in physics, particularly the strong interactions of quarks and gluons. The DiRAC Facility supports over 250 active researchers at 27 UK HEIs. This includes the research projects of 100 funded research staff (PDRAs and Research Fellows), over 50 post-graduate projects, and £1.6M of funded research grants. |
Exploitation Route | The theoretical results obtained provide input to a range of experimental programmes aiming to increase our understanding of Nature. The algorithms and software developed provide input to computer design. |
Sectors | Digital/Communication/Information Technologies (including Software) |
URL | http://dirac.ac.uk/ |
Description | Intel IPAG QCD codesign project |
Organisation | Intel Corporation |
Department | Intel Corporation (Jones Farm) |
Country | United States |
Sector | Private |
PI Contribution | We have collaborated with Intel corporation since 2014 with $720k of total direct funding, starting initially as an Intel parallel computing centre, and expanding to direct close collaboration with Intel Pathfinding and Architecture Group. |
Collaborator Contribution | We have performed detailed optimisation of QCD codes (Wilson, Domain Wall, Staggered) on Intel many-core architectures. We have investigated memory-system and interconnect performance, particularly on Intel's latest interconnect hardware, Omni-Path. We found serious performance issues and worked with Intel to plan a solution; this has been verified and is available as beta software. It will reach general availability in the Intel MPI 2019 release and allow threaded concurrent communications in MPI for the first time. A joint paper on the resolution was written with the Intel MPI team, and in the same paper the QCD programming techniques were applied to machine-learning gradient reduction in the Baidu Research all-reduce library, demonstrating a 10x gain for this critical step in machine learning in clustered environments. We are also working with Intel to verify future architectures intended to deliver exascale performance in 2021. |
Impact | The interconnect fix developed with Intel reached general availability in the Intel MPI 2019 release, enabling threaded concurrent communications in MPI for the first time. The same QCD programming techniques, applied to machine-learning gradient reduction in the Baidu Research all-reduce library, demonstrated a 10x gain for this critical step in clustered environments. The collaboration has been renewed annually in 2018, 2019 and 2020, and two DiRAC RSEs were hired by Intel to work on the Turing collaboration. |
Start Year | 2016 |
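The gradient-reduction step mentioned in the Intel collaboration above is typically implemented as a ring all-reduce, the communication pattern popularised by the Baidu Research all-reduce library. The following is a minimal single-process sketch of that pattern (not the collaboration's actual MPI code): n workers each hold a gradient vector, and after a reduce-scatter phase followed by an all-gather phase, every worker holds the elementwise sum, with each worker sending only O(vector length) data in total.

```python
def ring_allreduce(grads):
    """Simulate ring all-reduce: every worker ends with the elementwise sum.

    grads: list of n equal-length lists, one per worker.
    Returns the n per-worker buffers after the two ring phases.
    """
    n = len(grads)
    length = len(grads[0])
    # Chunk boundaries: each buffer is split into n near-equal chunks.
    bounds = [i * length // n for i in range(n + 1)]
    bufs = [list(g) for g in grads]

    # Phase 1: reduce-scatter. At step t, worker r sends chunk (r - t) mod n
    # to its ring neighbour (r + 1) mod n, which accumulates it. After n-1
    # steps, worker r holds the fully reduced chunk (r + 1) mod n.
    for t in range(n - 1):
        for r in range(n):
            c = (r - t) % n
            lo, hi = bounds[c], bounds[c + 1]
            dst = (r + 1) % n
            for k in range(lo, hi):
                bufs[dst][k] += bufs[r][k]

    # Phase 2: all-gather. At step t, worker r forwards chunk (r + 1 - t)
    # mod n around the ring; receivers overwrite rather than accumulate.
    for t in range(n - 1):
        for r in range(n):
            c = (r + 1 - t) % n
            lo, hi = bounds[c], bounds[c + 1]
            dst = (r + 1) % n
            bufs[dst][lo:hi] = bufs[r][lo:hi]

    return bufs
```

In a real clustered setting each inner loop body is an MPI send/receive pair executing concurrently on separate ranks; the threaded concurrent communications described above let several such transfers proceed in parallel from one process.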