Dirac 2.5 Operations
Lead Research Organisation:
University of Edinburgh
Department Name: Sch of Physics and Astronomy
Abstract
Physicists across the astronomy, nuclear and particle physics communities are focussed
on understanding how the Universe works at a very fundamental level. The distance scales
with which they work vary by 50 orders of magnitude from the smallest distances probed
by experiments at the Large Hadron Collider, deep within the atomic
nucleus, to the largest-scale galaxy clusters discovered out in space. The science challenges,
however, are linked through questions such as: How did the Universe begin and how is it evolving?
What are the fundamental constituents and fabric of the Universe, and how do they interact?
Progress requires new astronomical observations and experimental data but also
new theoretical insights. Theoretical understanding comes increasingly from large-scale
computations that allow us to confront the consequences of our theories very accurately
with the data or allow us to interrogate the data in detail to extract information that has
impact on our theories. These computations test the fastest computers that we have and
push the boundaries of technology in this sector. They also provide an excellent
environment for training students in state-of-the-art techniques for code optimisation and
data mining and visualisation.
The DiRAC-2.5 project builds on the success of the DiRAC HPC facility and will provide the resources needed
to support cutting-edge research during 2017 in all areas of science supported by STFC.
DiRAC-2.5 will maintain the existing DiRAC-2 services from April 2017, and will also provide an increase in computational
resources at Durham, Cambridge and Leicester.
This grant will support the operation of the Edinburgh DiRAC services, which presently comprise
98,384 operational computing cores serving around 80% of DiRAC computing cycles. The system is made up
from both the original 1.26PFlop/s DiRAC BlueGene/Q system and, following a recent transfer
to Edinburgh by STFC, six racks of the Hartree BlueJoule supercomputer.
The DiRAC project will also offer a team of three research software engineers who will help DiRAC researchers ensure that their scientific codes extract
the best possible performance from the hardware components of the DiRAC clusters. These highly skilled programmers will
increase the effective computational power of the DiRAC facility during 2017.
Planned Impact
The expected impact of the DiRAC 2.5 HPC facility is fully described in the attached pathways to impact document and includes:
1) Disseminating best practice in High Performance Computing software engineering throughout the theoretical Particle Physics, Astronomy and Nuclear physics communities in the UK as well as to industry partners.
2) Working on co-design projects with industry partners to improve future generations of hardware and software.
3) Development of new techniques in the area of High Performance Data Analytics which will benefit industry partners and researchers in other fields such as biomedicine, biology, engineering, economics and social science, and the natural environment who can use this new technology to improve research outcomes in their areas.
4) Sharing best practice on the design and operation of distributed HPC facilities with UK National e-Infrastructure partners.
5) Training the next generation of physical scientists to tackle problems effectively on state-of-the-art High Performance Computing facilities. Such skills are much in demand from high-tech industry.
6) Engagement with the general public to promote interest in science, and to explain how our ability to solve complex problems using the latest computer technology leads to new scientific capabilities/insights. Engagement of this kind also naturally encourages the uptake of STEM subjects in schools.
Publications
Figueras P
(2020)
Gravitational collapse in cubic Horndeski theories
in Classical and Quantum Gravity
Bozorgnia N
(2020)
The dark matter component of the Gaia radially anisotropic substructure
in Journal of Cosmology and Astroparticle Physics
Woss A
(2020)
Efficient solution of the multichannel Lüscher determinant condition through eigenvalue decomposition
in Physical Review D
Yurchenko S
(2020)
ExoMol line lists - XXXIX. Ro-vibrational molecular line list for CO2
in Monthly Notices of the Royal Astronomical Society
Gaikwad P
(2020)
Probing the thermal state of the intergalactic medium at z > 5 with the transmission spikes in high-resolution Ly α forest spectra
in Monthly Notices of the Royal Astronomical Society
Hassan S
(2020)
Testing galaxy formation simulations with damped Lyman-α abundance and metallicity evolution
in Monthly Notices of the Royal Astronomical Society
Coulton W
(2020)
Weak lensing minima and peaks: Cosmological constraints and the impact of baryons
in Monthly Notices of the Royal Astronomical Society
Harrison J
(2020)
Bc → J/ψ form factors for the full q² range from lattice QCD
in Physical Review D
Offler S
(2020)
News from bottomonium spectral functions in thermal QCD
Nealon R
(2020)
Spirals, shadows & precession in HD 100453 - II. The hidden companion
in Monthly Notices of the Royal Astronomical Society
Lovell M
(2020)
Local group star formation in warm and self-interacting dark matter cosmologies
in Monthly Notices of the Royal Astronomical Society
Van Daalen M
(2020)
Exploring the effects of galaxy formation on matter clustering through a library of simulation power spectra
in Monthly Notices of the Royal Astronomical Society
Gómez-Guijarro C
(2020)
How primordial magnetic fields shrink galaxies
in Monthly Notices of the Royal Astronomical Society
Adamek J
(2020)
Numerical solutions to Einstein's equations in a shearing-dust universe: a code comparison
in Classical and Quantum Gravity
Patsourakos S
(2020)
Decoding the Pre-Eruptive Magnetic Field Configurations of Coronal Mass Ejections
in Space Science Reviews
Griffin A
(2020)
AGNs at the cosmic dawn: predictions for future surveys from a ΛCDM cosmological model
in Monthly Notices of the Royal Astronomical Society
Porth L
(2020)
Fast estimation of aperture mass statistics - I. Aperture mass variance and an application to the CFHTLenS data
in Monthly Notices of the Royal Astronomical Society
Gratton S
(2020)
Understanding parameter differences between analyses employing nested data subsets
in Monthly Notices of the Royal Astronomical Society
Despali G
(2020)
The lensing properties of subhaloes in massive elliptical galaxies in sterile neutrino cosmologies
in Monthly Notices of the Royal Astronomical Society
Bennett E
(2020)
Color dependence of tensor and scalar glueball masses in Yang-Mills theories
in Physical Review D
Gonzalez-Perez V
(2020)
Do model emission line galaxies live in filaments at z ~ 1?
in Monthly Notices of the Royal Astronomical Society
Li B
(2020)
Measuring the baryon acoustic oscillation peak position with different galaxy selections
in Monthly Notices of the Royal Astronomical Society
Cooper L
(2020)
Bc → Bs(d) form factors
He J
(2020)
Modelling the tightest relation between galaxy properties and dark matter halo properties from hydrodynamical simulations of galaxy formation
in Monthly Notices of the Royal Astronomical Society
Font A
(2020)
The artemis simulations: stellar haloes of Milky Way-mass galaxies
in Monthly Notices of the Royal Astronomical Society
Oppenheimer B
(2020)
Feedback from supermassive black holes transforms centrals into passive galaxies by ejecting circumgalactic gas
in Monthly Notices of the Royal Astronomical Society
Ploeckinger S
(2020)
Radiative cooling rates, ion fractions, molecule abundances, and line emissivities including self-shielding and both local and metagalactic radiation fields
in Monthly Notices of the Royal Astronomical Society
Poole-McKenzie R
(2020)
Informing dark matter direct detection limits with the ARTEMIS simulations
in Journal of Cosmology and Astroparticle Physics
Debattista V
(2020)
Box/peanut-shaped bulges in action space
in Monthly Notices of the Royal Astronomical Society
Amarante J
(2020)
The Splash without a Merger
in The Astrophysical Journal
Kegerreis J
(2020)
Atmospheric Erosion by Giant Impacts onto Terrestrial Planets
in The Astrophysical Journal
Yurchenko S
(2020)
ExoMol line lists - XXXVIII. High-temperature molecular line list of silicon dioxide (SiO2)
in Monthly Notices of the Royal Astronomical Society
Widdicombe J
(2020)
Black hole formation in relativistic Oscillaton collisions
in Journal of Cosmology and Astroparticle Physics
Cuesta-Lazaro C
(2020)
Towards a non-Gaussian model of redshift space distortions
in Monthly Notices of the Royal Astronomical Society
Liow K
(2020)
The role of collision speed, cloud density, and turbulence in the formation of young massive clusters via cloud-cloud collisions
in Monthly Notices of the Royal Astronomical Society
Xu W
(2020)
Galaxy properties in the cosmic web of EAGLE simulation
in Monthly Notices of the Royal Astronomical Society
Brown S
(2020)
Connecting the structure of dark matter haloes to the primordial power spectrum
in Monthly Notices of the Royal Astronomical Society
Yurchenko S
(2020)
ExoMol line lists - XL. Rovibrational molecular line list for the hydronium ion (H3O+)
in Monthly Notices of the Royal Astronomical Society
Edwards B
(2020)
Hubble WFC3 Spectroscopy of the Habitable-zone Super-Earth LHS 1140 b
in The Astronomical Journal
Wurster J
(2020)
Non-ideal magnetohydrodynamics versus turbulence - I. Which is the dominant process in protostellar disc formation?
in Monthly Notices of the Royal Astronomical Society
Mitchell P
(2020)
Galactic outflow rates in the EAGLE simulations
in Monthly Notices of the Royal Astronomical Society
Bennett J
(2020)
Resolving shocks and filaments in galaxy formation simulations: effects on gas properties and star formation in the circumgalactic medium
in Monthly Notices of the Royal Astronomical Society
Cooke R
(2020)
The ACCELERATION programme: I. Cosmology with the redshift drift
in Monthly Notices of the Royal Astronomical Society
Miles P
(2020)
Fallback Rates from Partial Tidal Disruption Events
in The Astrophysical Journal
Robson D
(2020)
X-ray emission from hot gas in galaxy groups and clusters in simba
in Monthly Notices of the Royal Astronomical Society
Soler J
(2020)
The history of dynamics and stellar feedback revealed by the H I filamentary structure in the disk of the Milky Way
in Astronomy & Astrophysics
Creci G
(2020)
Evolution of black hole shadows from superradiance
in Physical Review D
Stafford S
(2020)
Exploring extensions to the standard cosmological model and the impact of baryons on small scales
in Monthly Notices of the Royal Astronomical Society
Lee J
(2020)
Dual Effects of Ram Pressure on Star Formation in Multiphase Disk Galaxies with Strong Stellar Feedback
in The Astrophysical Journal
Pluriel W
(2020)
ARES. III. Unveiling the Two Faces of KELT-7 b with HST WFC3
in The Astronomical Journal
Description | DiRAC 2.5 is a facility to support leading-edge computational astronomy and particle physics in the UK. This has resulted in over 500 peer-reviewed publications. |
Exploitation Route | Build on the scientific knowledge and computational techniques developed. |
Sectors | Digital/Communication/Information Technologies (including Software), Education |
Description | Intel IPAG QCD codesign project |
Organisation | Intel Corporation |
Department | Intel Corporation (Jones Farm) |
Country | United States |
Sector | Private |
PI Contribution | We have collaborated with Intel Corporation since 2014, with $720k of total direct funding, starting initially as an Intel Parallel Computing Centre and expanding to direct close collaboration with the Intel Pathfinding and Architecture Group. |
Collaborator Contribution | We have performed detailed optimisation of QCD codes (Wilson, Domain Wall, Staggered) on Intel many-core architectures. We have investigated the memory system and interconnect performance, particularly on Intel's latest interconnect hardware, Omni-Path. We found serious performance issues and worked with Intel to plan a solution; this has been verified and is available as beta software. It will reach general availability in the Intel MPI 2019 release and will allow threaded concurrent communications in MPI for the first time (a minimal sketch of this communication pattern is given after this record). A joint paper on the resolution was written with the Intel MPI team, and the application of the same QCD programming techniques to machine-learning gradient reduction was demonstrated in that paper using the Baidu Research all-reduce library, showing a 10x gain for this critical step in machine learning in clustered environments. We are also working with Intel to verify future architectures that will deliver exascale performance in 2021. |
Impact | We have performed detailed optimisation of QCD codes (Wilson, Domain Wall, Staggered) on Intel many-core architectures. We have investigated the memory system and interconnect performance, particularly on Intel's latest interconnect hardware, Omni-Path. We found serious performance issues and worked with Intel to plan a solution; this has been verified and is available as beta software. It will reach general availability in the Intel MPI 2019 release and will allow threaded concurrent communications in MPI for the first time. A joint paper on the resolution was written with the Intel MPI team, and the application of the same QCD programming techniques to machine-learning gradient reduction was demonstrated in that paper using the Baidu Research all-reduce library, showing a 10x gain for this critical step in machine learning in clustered environments. This collaboration has been renewed annually in 2018, 2019 and 2020. Two DiRAC RSEs were hired by Intel to work on the Turing collaboration. |
Start Year | 2016 |
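To illustrate the threaded concurrent MPI communication referred to in the record above, the following is a minimal, self-contained C++ sketch, not the DiRAC/Intel production code. It shows the pattern that the MPI improvement enables: several OpenMP threads each drive their own point-to-point message, which requires MPI_THREAD_MULTIPLE support from the library. The neighbour ranks, buffer sizes, thread count and tags are illustrative assumptions.

```cpp
// Minimal sketch of threaded concurrent MPI halo-style exchange.
// Requires an MPI library providing MPI_THREAD_MULTIPLE and OpenMP.
#include <mpi.h>
#include <omp.h>
#include <vector>

int main(int argc, char** argv) {
    int provided = 0;
    MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
    if (provided < MPI_THREAD_MULTIPLE) { MPI_Abort(MPI_COMM_WORLD, 1); }

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    const int nthreads = 4;        // e.g. one communication thread per face (illustrative)
    const int nwords   = 1 << 16;  // per-face buffer size (illustrative)
    std::vector<std::vector<double>> sendbuf(nthreads, std::vector<double>(nwords, 1.0));
    std::vector<std::vector<double>> recvbuf(nthreads, std::vector<double>(nwords, 0.0));

    int right = (rank + 1) % size;
    int left  = (rank - 1 + size) % size;

    // Each thread posts and completes its own send/receive pair;
    // the thread id doubles as the message tag to keep pairs distinct.
    #pragma omp parallel num_threads(nthreads)
    {
        int t = omp_get_thread_num();
        MPI_Request req[2];
        MPI_Irecv(recvbuf[t].data(), nwords, MPI_DOUBLE, left,  t, MPI_COMM_WORLD, &req[0]);
        MPI_Isend(sendbuf[t].data(), nwords, MPI_DOUBLE, right, t, MPI_COMM_WORLD, &req[1]);
        MPI_Waitall(2, req, MPI_STATUSES_IGNORE);
    }

    MPI_Finalize();
    return 0;
}
```

The benefit of the MPI change described in the record is that these per-thread messages can progress concurrently inside the library rather than being serialised, which is the behaviour the collaboration worked with Intel to enable.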
Title | FP16-S7E8 MIXED PRECISION FOR DEEP LEARNING AND OTHER ALGORITHMS |
Description | We demonstrated that a new non-IEEE 16-bit floating-point format is the optimal choice for machine-learning training and proposed corresponding instructions (a minimal sketch of the format appears after this record). |
IP Reference | US20190042544 |
Protection | Patent application published |
Year Protection Granted | 2019 |
Licensed | Yes |
Impact | We demonstrated that a new non-IEEE 16-bit floating-point format is the optimal choice for machine-learning training and proposed corresponding instructions. Intel filed this with the US Patent Office. This IP is owned by Intel under the terms of the Intel Turing strategic partnership contract. As a co-inventor I have been named on the patent application. The proposed format has been announced as planned for use in future Intel architectures. This collaboration with Turing emerged out of an investment in Edinburgh by the Intel Pathfinding and Architecture Group in codesign with lattice gauge theory simulations. Intel hired DiRAC RSEs Kashyap and Lepper and placed them in Edinburgh to work with me on machine-learning codesign through the Turing programme. |
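To make the format concrete, the following is a minimal sketch, under stated assumptions, of a 16-bit type with a sign bit, 8 exponent bits and 7 mantissa bits (the same layout as the widely used bfloat16): a float32 is narrowed by keeping its top 16 bits with round-to-nearest-even, and widened back by zero-filling the discarded bits. The function names are illustrative, and this is an illustration of the format family, not the patented instructions themselves.

```cpp
// Minimal sketch of a sign + 8-exponent + 7-mantissa 16-bit float,
// implemented by truncating/rounding the upper 16 bits of an IEEE float32.
#include <cstdint>
#include <cstring>
#include <cstdio>

// Narrow a float32 to the 16-bit format with round-to-nearest-even.
// NaN/Inf handling is omitted for brevity.
uint16_t to_fp16_s7e8(float x) {
    uint32_t bits;
    std::memcpy(&bits, &x, sizeof(bits));      // reinterpret bits without UB
    uint32_t lsb      = (bits >> 16) & 1u;     // round-to-nearest-even tie break
    uint32_t rounding = 0x7FFFu + lsb;
    bits += rounding;
    return static_cast<uint16_t>(bits >> 16);
}

// Widen back to float32 by zero-filling the discarded mantissa bits.
float from_fp16_s7e8(uint16_t h) {
    uint32_t bits = static_cast<uint32_t>(h) << 16;
    float x;
    std::memcpy(&x, &bits, sizeof(x));
    return x;
}

int main() {
    float v = 3.14159265f;
    uint16_t h = to_fp16_s7e8(v);
    std::printf("%f -> 0x%04x -> %f\n", v, static_cast<unsigned>(h), from_fp16_s7e8(v));
    return 0;
}
```

Keeping the full 8-bit exponent is what makes this layout attractive for training: it preserves the dynamic range of float32 while halving storage and bandwidth, at the cost of mantissa precision.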