Dirac 2.5 Operations
Lead Research Organisation:
University of Edinburgh
Department Name: School of Physics and Astronomy
Abstract
Physicists across the astronomy, nuclear and particle physics communities are focussed
on understanding how the Universe works at a very fundamental level. The distance scales
with which they work vary by 50 orders of magnitude from the smallest distances probed
by experiments at the Large Hadron Collider, deep within the atomic
nucleus, to the largest-scale galaxy clusters discovered out in space. The science challenges,
however, are linked through questions such as: how did the Universe begin and how is it evolving?
What are the fundamental constituents and fabric of the Universe, and how do they interact?
Progress requires new astronomical observations and experimental data but also
new theoretical insights. Theoretical understanding comes increasingly from large-scale
computations that allow us to confront the consequences of our theories very accurately
with the data or allow us to interrogate the data in detail to extract information that has
impact on our theories. These computations test the fastest computers that we have and
push the boundaries of technology in this sector. They also provide an excellent
environment for training students in state-of-the-art techniques for code optimisation and
data mining and visualisation.
The DiRAC-2.5 project builds on the success of the DiRAC HPC facility and will provide the resources needed
to support cutting-edge research during 2017 in all areas of science supported by STFC.
DiRAC-2.5 will maintain the existing DiRAC-2 services from April 2017, and will also provide an increase in computational
resources at Durham, Cambridge and Leicester.
This grant will support the operation of the Edinburgh DiRAC services, which presently comprise
98,384 operational computing cores serving around 80% of DiRAC computing cycles. The system comprises
both the original 1.26 PFlop/s DiRAC BlueGene/Q system and, following a recent transfer
to Edinburgh by STFC, six racks of the Hartree BlueJoule supercomputer.
The DiRAC project will also offer a team of three research software engineers who will help DiRAC researchers ensure that their scientific codes extract
the best possible performance from the hardware of the DiRAC clusters. These highly skilled programmers will
increase the effective computational power of the DiRAC facility during 2017.
Planned Impact
The expected impact of the DiRAC 2.5 HPC facility is fully described in the attached pathways to impact document and includes:
1) Disseminating best practice in High Performance Computing software engineering throughout the theoretical Particle Physics, Astronomy and Nuclear physics communities in the UK as well as to industry partners.
2) Working on co-design projects with industry partners to improve future generations of hardware and software.
3) Development of new techniques in the area of High Performance Data Analytics which will benefit industry partners and researchers in other fields such as biomedicine, biology, engineering, economics and social science, and the natural environment who can use this new technology to improve research outcomes in their areas.
4) Sharing best practice on the design and operation of distributed HPC facilities with UK National e-Infrastructure partners.
5) Training the next generation of physical scientists to tackle problems effectively on state-of-the-art High Performance Computing facilities. Such skills are much in demand from high-tech industry.
6) Engagement with the general public to promote interest in science, and to explain how our ability to solve complex problems using the latest computer technology leads to new scientific capabilities/insights. Engagement of this kind also naturally encourages the uptake of STEM subjects in schools.
Publications
Cataldi P
(2022)
Fingerprints of modified gravity on galaxies in voids
in Monthly Notices of the Royal Astronomical Society
Sawala T
(2023)
The Local Group's mass: probably no more than the sum of its parts
in Monthly Notices of the Royal Astronomical Society
Irodotou D
(2022)
The effects of AGN feedback on the structural and dynamical properties of Milky Way-mass galaxies in cosmological simulations
in Monthly Notices of the Royal Astronomical Society
Grove C
(2022)
The DESI N -body simulation project - I. Testing the robustness of simulations for the DESI dark time survey
in Monthly Notices of the Royal Astronomical Society
Vidal J
(2020)
Efficiency of tidal dissipation in slowly rotating fully convective stars or planets
in Monthly Notices of the Royal Astronomical Society
Ghosh S
(2022)
Age dissection of the vertical breathing motions in Gaia DR2: evidence for spiral driving
in Monthly Notices of the Royal Astronomical Society
Van Son L
(2019)
Galaxies with monstrous black holes in galaxy cluster environments
in Monthly Notices of the Royal Astronomical Society
Haworth T
(2020)
The observational anatomy of externally photoevaporating planet-forming discs - I. Atomic carbon
in Monthly Notices of the Royal Astronomical Society
Chaikin E
(2022)
The importance of the way in which supernova energy is distributed around young stellar populations in simulations of galaxies
in Monthly Notices of the Royal Astronomical Society
Haworth T
(2021)
Warm millimetre dust in protoplanetary discs near massive stars
in Monthly Notices of the Royal Astronomical Society
Schreyer E
(2024)
Using Lyα transits to constrain models of atmospheric escape
in Monthly Notices of the Royal Astronomical Society
Brown S
(2020)
Connecting the structure of dark matter haloes to the primordial power spectrum
in Monthly Notices of the Royal Astronomical Society
Lovell M
(2021)
The spatial distribution of Milky Way satellites, gaps in streams, and the nature of dark matter
in Monthly Notices of the Royal Astronomical Society
Stafford S
(2020)
Exploring extensions to the standard cosmological model and the impact of baryons on small scales
in Monthly Notices of the Royal Astronomical Society
Scott L
(2021)
Convective core entrainment in 1D main-sequence stellar models
in Monthly Notices of the Royal Astronomical Society
Wilson B
(2022)
A measurement of the Lyβ forest power spectrum and its cross with the Lyα forest in X-Shooter XQ-100
in Monthly Notices of the Royal Astronomical Society
Santos-Santos I
(2023)
The Tucana dwarf spheroidal: a distant backsplash galaxy of M31?
in Monthly Notices of the Royal Astronomical Society
Robertson A
(2020)
Mapping dark matter and finding filaments: calibration of lensing analysis techniques on simulated data
in Monthly Notices of the Royal Astronomical Society
Davies C
(2022)
Cosmological forecasts with the clustering of weak lensing peaks
in Monthly Notices of the Royal Astronomical Society
Nightingale J
(2024)
Scanning for dark matter subhaloes in Hubble Space Telescope imaging of 54 strong lenses
in Monthly Notices of the Royal Astronomical Society
Anderson S
(2022)
The secular growth of bars revealed by flat (peak + shoulders) density profiles
in Monthly Notices of the Royal Astronomical Society
Puchwein E
(2023)
The Sherwood-Relics simulations: overview and impact of patchy reionization and pressure smoothing on the intergalactic medium
in Monthly Notices of the Royal Astronomical Society
Genina A
(2023)
On the edge: the relation between stellar and dark matter haloes of Milky Way-mass galaxies
in Monthly Notices of the Royal Astronomical Society
Vera-Casanova A
(2022)
Linking the brightest stellar streams with the accretion history of Milky Way like galaxies
in Monthly Notices of the Royal Astronomical Society
He Q
(2023)
Testing strong lensing subhalo detection with a cosmological simulation
in Monthly Notices of the Royal Astronomical Society
Rodríguez Montero F
(2024)
The impact of cosmic rays on the interstellar medium and galactic outflows of Milky Way analogues
in Monthly Notices of the Royal Astronomical Society
Bilimogga P
(2022)
Using eagle simulations to study the effect of observational constraints on the determination of H i asymmetries in galaxies
in Monthly Notices of the Royal Astronomical Society
Dutta R
(2021)
Metal-enriched halo gas across galaxy overdensities over the last 10 billion years
in Monthly Notices of the Royal Astronomical Society
Dumitru S
(2019)
Predictions and sensitivity forecasts for reionization-era [C ii ] line intensity mapping
in Monthly Notices of the Royal Astronomical Society
Becker G
(2021)
The mean free path of ionizing photons at 5 < z < 6: evidence for rapid evolution near reionization
in Monthly Notices of the Royal Astronomical Society
Deason A
(2020)
The edge of the Galaxy
in Monthly Notices of the Royal Astronomical Society
Huško F
(2023)
The complex interplay of AGN jet-inflated bubbles and the intracluster medium
in Monthly Notices of the Royal Astronomical Society
Kruijssen J
(2019)
The formation and assembly history of the Milky Way revealed by its globular cluster population
in Monthly Notices of the Royal Astronomical Society
Lee E
(2022)
A multisimulation study of relativistic SZ temperature scalings in galaxy clusters and groups
in Monthly Notices of the Royal Astronomical Society
Stafford S
(2021)
Testing extensions to ΛCDM on small scales with forthcoming cosmic shear surveys
in Monthly Notices of the Royal Astronomical Society
Igoshev A
(2023)
Three-dimensional magnetothermal evolution of off-centred dipole magnetic field configurations in neutron stars
in Monthly Notices of the Royal Astronomical Society
Owens A
(2024)
ExoMol line lists - LI. Molecular line lists for lithium hydroxide (LiOH)
in Monthly Notices of the Royal Astronomical Society
Vandenbroucke B
(2020)
Infrared luminosity functions and dust mass functions in the EAGLE simulation
in Monthly Notices of the Royal Astronomical Society
Huško F
(2023)
The buildup of galaxies and their spheroids: The contributions of mergers, disc instabilities, and star formation
in Monthly Notices of the Royal Astronomical Society
Borukhovetskaya A
(2022)
The tidal evolution of the Fornax dwarf spheroidal and its globular clusters
in Monthly Notices of the Royal Astronomical Society
Wilkins S
(2022)
First Light and Reionisation Epoch Simulations (FLARES) - VI. The colour evolution of galaxies z = 5-15
in Monthly Notices of the Royal Astronomical Society
Gratton S
(2020)
Understanding parameter differences between analyses employing nested data subsets
in Monthly Notices of the Royal Astronomical Society
Bolton J
(2022)
Limits on non-canonical heating and turbulence in the intergalactic medium from the low redshift Lyman α forest
in Monthly Notices of the Royal Astronomical Society
Iyer K
(2020)
The diversity and variability of star formation histories in models of galaxy evolution
in Monthly Notices of the Royal Astronomical Society
Owen J
(2020)
Testing exoplanet evaporation with multitransiting systems
in Monthly Notices of the Royal Astronomical Society
Tam S
(2020)
The distribution of dark matter and gas spanning 6 Mpc around the post-merger galaxy cluster MS 0451-03
in Monthly Notices of the Royal Astronomical Society
Rey M
(2024)
Boosting galactic outflows with enhanced resolution
in Monthly Notices of the Royal Astronomical Society
Fancher J
(2023)
On the relative importance of shocks and self-gravity in modifying tidal disruption event debris streams
in Monthly Notices of the Royal Astronomical Society
Kelly A
(2021)
The origin of X-ray coronae around simulated disc galaxies
in Monthly Notices of the Royal Astronomical Society
Kong S
(2022)
Filament formation via collision-induced magnetic reconnection - formation of a star cluster
in Monthly Notices of the Royal Astronomical Society
| Description | DiRAC 2.5 is a facility to support leading-edge computational astronomy and particle physics in the UK. The facility has resulted in over 1000 peer-reviewed publications. Many new discoveries about the formation and evolution of galaxies, star formation, planet formation and particle physics theory have been made possible by the award. |
| Exploitation Route | Build on the scientific knowledge and computational techniques developed. Many international collaborative projects are supported by the HPC resources provided by DiRAC. |
| Sectors | Aerospace, Defence and Marine; Creative Economy; Digital/Communication/Information Technologies (including Software); Education; Healthcare |
| URL | http://www.dirac.ac.uk |
| Description | A close working relationship on co-design of hardware and software. |
| First Year Of Impact | 2015 |
| Sector | Digital/Communication/Information Technologies (including Software), Education |
| Impact Types | Economic |
| Title | Lattice dataset for the paper arXiv:2202.08795 "Simulating rare kaon decays using domain wall lattice QCD with physical light quark masses" |
| Description | Release for https://arxiv.org/abs/2202.08795 |
| Type Of Material | Database/Collection of data |
| Year Produced | 2022 |
| Provided To Others? | Yes |
| URL | https://zenodo.org/record/6369178 |
| Title | Symplectic lattice gauge theories on Grid: approaching the conformal window---data release |
| Description | This is the data release relative to the paper "Symplectic lattice gauge theories on Grid: approaching the conformal window" (arXiv:2306.11649). It contains pre-analysed data that can be plotted, and raw data that can be analysed and plotted through the analysis code in doi:10.5281/zenodo.8136514. |
| Type Of Material | Database/Collection of data |
| Year Produced | 2023 |
| Provided To Others? | Yes |
| URL | https://zenodo.org/record/8136452 |
| Title | Symplectic lattice gauge theories on Grid: approaching the conformal window---data release |
| Description | This is the data release relative to the paper "Symplectic lattice gauge theories on Grid: approaching the conformal window" (arXiv:2306.11649). It contains pre-analysed data that can be plotted, and raw data that can be analysed and plotted through the analysis code in doi:10.5281/zenodo.8136514. |
| Type Of Material | Database/Collection of data |
| Year Produced | 2023 |
| Provided To Others? | Yes |
| URL | https://zenodo.org/record/8136451 |
| Description | Intel IPAG QCD codesign project |
| Organisation | Intel Corporation |
| Department | Intel Corporation (Jones Farm) |
| Country | United States |
| Sector | Private |
| PI Contribution | We have collaborated with Intel Corporation since 2014, with $720k of total direct funding, starting as an Intel Parallel Computing Centre and expanding to direct, close collaboration with the Intel Pathfinding and Architecture Group. |
| Collaborator Contribution | We have performed detailed optimisation of QCD codes (Wilson, Domain Wall, Staggered) on Intel many-core architectures. We have investigated memory system and interconnect performance, particularly on Intel's latest interconnect hardware, Omni-Path. We found serious performance issues and worked with Intel to plan a solution; this has been verified and is available as beta software. It will reach general availability in the Intel MPI 2019 release and will allow threaded concurrent communications in MPI for the first time. A joint paper on the resolution was written with the Intel MPI team, and the same QCD programming techniques were applied in that paper to the Baidu Research allreduce library, demonstrating a 10x gain for this critical step in machine learning in clustered environments. We are also working with Intel to verify future architectures that will deliver exascale performance in 2021. |
| Impact | We have performed detailed optimisation of QCD codes (Wilson, Domain Wall, Staggered) on Intel many-core architectures. We have investigated memory system and interconnect performance, particularly on Intel's latest interconnect hardware, Omni-Path. We found serious performance issues and worked with Intel to plan a solution; this has been verified and is available as beta software. It will reach general availability in the Intel MPI 2019 release and will allow threaded concurrent communications in MPI for the first time (a minimal sketch of this threaded communication pattern is given after this record). A joint paper on the resolution was written with the Intel MPI team, and the same QCD programming techniques were applied in that paper to the Baidu Research allreduce library, demonstrating a 10x gain for this critical step in machine learning in clustered environments. This collaboration has been renewed annually in 2018, 2019 and 2020. Two DiRAC RSEs were hired by Intel to work on the Turing collaboration. |
| Start Year | 2016 |
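The record above describes enabling threaded concurrent communications in MPI. The following is a minimal sketch of that communication pattern under stated assumptions: it is not the project's production QCD code, and the thread count, message size and nearest-neighbour ring exchange are illustrative choices. It simply shows several threads driving independent MPI transfers at once, which requires MPI_THREAD_MULTIPLE support.

```cpp
#include <mpi.h>
#include <omp.h>
#include <cstdio>
#include <vector>

int main(int argc, char **argv) {
    // Concurrent MPI traffic from several threads is only legal when the
    // library provides MPI_THREAD_MULTIPLE.
    int provided = 0;
    MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
    if (provided < MPI_THREAD_MULTIPLE) {
        std::fprintf(stderr, "MPI_THREAD_MULTIPLE not available\n");
        MPI_Abort(MPI_COMM_WORLD, 1);
    }

    int rank = 0, size = 1;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    const int nthreads = 4;        // illustrative: e.g. one thread per exchange direction
    const int count    = 1 << 20;  // illustrative message size (doubles)
    std::vector<std::vector<double>> sendbuf(nthreads, std::vector<double>(count, double(rank)));
    std::vector<std::vector<double>> recvbuf(nthreads, std::vector<double>(count, 0.0));

    const int right = (rank + 1) % size;
    const int left  = (rank - 1 + size) % size;

    // Each thread posts and completes its own send/receive pair concurrently,
    // using the thread index as the message tag to keep the streams distinct.
    #pragma omp parallel num_threads(nthreads)
    {
        const int t = omp_get_thread_num();
        MPI_Request reqs[2];
        MPI_Irecv(recvbuf[t].data(), count, MPI_DOUBLE, left,  t, MPI_COMM_WORLD, &reqs[0]);
        MPI_Isend(sendbuf[t].data(), count, MPI_DOUBLE, right, t, MPI_COMM_WORLD, &reqs[1]);
        MPI_Waitall(2, reqs, MPI_STATUSES_IGNORE);
    }

    if (rank == 0) std::printf("concurrent threaded exchange completed on %d ranks\n", size);
    MPI_Finalize();
    return 0;
}
```

In a production halo exchange each thread would typically handle one lattice direction; the per-thread tag used here is simply what keeps the concurrent transfers distinct in this sketch.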
| Title | FP16-S7E8 MIXED PRECISION FOR DEEP LEARNING AND OTHER ALGORITHMS |
| Description | We demonstrated that a new non-IEEE 16-bit floating point format is the optimal choice for machine learning training and proposed supporting instructions. |
| IP Reference | US20190042544 |
| Protection | Patent application published |
| Year Protection Granted | 2019 |
| Licensed | Yes |
| Impact | We demonstrated that a new non-IEEE 16-bit floating point format is the optimal choice for machine learning training and proposed supporting instructions. Intel filed this with the US patent office. This IP is owned by Intel under the terms of the Intel-Turing strategic partnership contract. As a co-inventor, I have been named on the patent application. The proposed format has been announced as planned for use in future Intel architectures. This collaboration with Turing emerged out of an investment in Edinburgh by the Intel Pathfinding and Architecture Group in codesign with lattice gauge theory simulations. Intel hired DiRAC RSEs Kashyap and Lepper and placed them in Edinburgh to work with me on machine learning codesign through the Turing programme. A simplified, illustrative sketch of how such a 16-bit format relates to IEEE binary32 is given after this record. |
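The patent record above concerns a 16-bit floating point format with one sign bit, eight exponent bits and seven significand bits. As a hedged illustration only (not the patented implementation), the sketch below shows how such a layout can be derived from IEEE binary32 by keeping the top 16 bits of the encoding, which preserves the full float32 exponent range while truncating the significand. The function names and the round-to-nearest-even choice are assumptions made for this sketch.

```cpp
#include <cstdint>
#include <cstdio>
#include <cstring>

// Hypothetical helper names for this sketch; the real format is defined in the patent.
using fp16_s7e8 = std::uint16_t;

// Narrow IEEE binary32 to a 1-sign / 8-exponent / 7-significand layout by
// rounding the low 16 bits away (round to nearest, ties to even; NaN handling omitted).
fp16_s7e8 float_to_s7e8(float x) {
    std::uint32_t bits;
    std::memcpy(&bits, &x, sizeof(bits));
    std::uint32_t lsb = (bits >> 16) & 1u;   // last bit that survives the narrowing
    bits += 0x7FFFu + lsb;                   // rounding bias
    return static_cast<fp16_s7e8>(bits >> 16);
}

// Widen back to binary32 by zero-filling the discarded significand bits.
float s7e8_to_float(fp16_s7e8 h) {
    std::uint32_t bits = static_cast<std::uint32_t>(h) << 16;
    float x;
    std::memcpy(&x, &bits, sizeof(x));
    return x;
}

int main() {
    float g = 0.00123456f;                   // e.g. a gradient value during training
    fp16_s7e8 h = float_to_s7e8(g);
    std::printf("%.8f -> 0x%04x -> %.8f\n", g, static_cast<unsigned>(h), s7e8_to_float(h));
    return 0;
}
```

Because the exponent width matches binary32, conversion in either direction needs only shifts and a rounding add, which is part of why such formats are attractive for gradient arithmetic in training workloads.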
| Title | Symplectic lattice gauge theories on Grid: approaching the conformal window-analysis code |
| Description | This is the analysis code that has been used to analyse and plot the data for the paper 'Symplectic lattice gauge theories on Grid: approaching the conformal window' (arXiv:2306.11649). |
| Type Of Technology | Software |
| Year Produced | 2023 |
| Open Source License? | Yes |
| URL | https://zenodo.org/record/8136513 |
