DiRAC-2: Recurrent Costs for Complexity@DiRAC Cluster at University of Leicester
Lead Research Organisation:
University of Leicester
Department Name: Physics and Astronomy
Abstract
This award is for the recurrent costs of the Complexity@DiRAC cluster at the University of Leicester. It will cover the electricity and support staff costs of the cluster, which is part of the DiRAC-2 national facility.
Planned Impact
The pathways to impact for the project are as agreed at the DiRAC PMB meeting on 21 November 2011 and subsequently reported on in the annual reports of the facility.
The high-performance computing applications supported by DiRAC typically involve new algorithms and implementations optimised for high energy efficiency. These impose demands on computer architectures that the computing industry has found useful for hardware and system software design and testing.
DiRAC researchers have ongoing collaborations with computing companies that maintain this strong connection between the scientific goals of the DiRAC Consortium and the development of the new computing technologies that drive the commercial high-performance computing market, with economic benefits to the companies involved and more powerful computing capabilities made available to other application areas, including many that address socio-economic challenges.
Boyle (University of Edinburgh) co-designed the Blue Gene/Q compute chip with IBM. It is now deployed in 1.3 Pflop/s systems at Edinburgh and Daresbury and at 15 other sites worldwide, including the world's largest system at Lawrence Livermore National Laboratory. This is the greenest HPC architecture in the world and offers a route to affordable petascale and exascale computing that will have profound effects on the Energy, Health, Environment and Security sectors.
Boyle and IBM have four US patents pending from the Blue Gene/Q chip-set design project. Boyle was a co-author of IBM's Gauss Award-winning paper at the International Supercomputing Conference and has co-authored IEEE and IBM Journal papers on the Blue Gene/Q architecture with IBM.
Falle (University of Leeds) partially developed the MG code on DiRAC. It has been used in the National Grid COOLTRANS project to model the dispersion of CO2 from high-pressure pipelines transporting CO2 for carbon sequestration.
At UCL, a virtual quantum laboratory suite has been created by the UCL spin-out company QUANTEMOL. It has applications in industry, energy, health and environmental monitoring.
Calleja (Cambridge University) is using DiRAC to work with Xyratex, the UK's leading disk manufacturer, to develop the fastest storage arrays in the world.
The COSMOS consortium (Shellard) has had long-standing collaborations with SGI (since 1997) and Intel (since 2003), which have given access to leading-edge shared-memory technologies, including the world's first UV2000 in 2012, which was also the first SMP system equipped with Intel Xeon Phi (Knights Corner) processors. Adaptive Computing is using the COSMOS@DiRAC platform to develop a single-image version of its MOAB HPC Suite.
Publications
Harries T
(2019)
The TORUS radiation transfer code
in Astronomy and Computing
Sedda M
(2020)
The missing link in gravitational-wave astronomy: discoveries waiting in the decihertz range
in Classical and Quantum Gravity
Helfer T
(2022)
Malaise and remedy of binary boson-star initial data
in Classical and Quantum Gravity
Pandya A
(2022)
Dynamics of a nonminimally coupled scalar field in asymptotically AdS4 spacetime
in Classical and Quantum Gravity
Horowitz G
(2019)
Creating a traversable wormhole
in Classical and Quantum Gravity
Adamek J
(2020)
Numerical solutions to Einstein's equations in a shearing-dust universe: a code comparison
in Classical and Quantum Gravity
Aurrekoetxea J
(2020)
Coherent gravitational waveforms and memory from cosmic string loops
in Classical and Quantum Gravity
Clough K
(2021)
Continuity equations for general matter: applications in numerical relativity
in Classical and Quantum Gravity
Radia M
(2022)
Lessons for adaptive mesh refinement in numerical relativity
in Classical and Quantum Gravity
Ripley J
(2022)
Computing the quasinormal modes and eigenfunctions for the Teukolsky equation using horizon penetrating, hyperboloidally compactified coordinates
in Classical and Quantum Gravity
Rosca-Mead R
(2019)
Inverse-chirp signals and spontaneous scalarisation with self-interacting potentials in stellar collapse
in Classical and Quantum Gravity
Barone T
(2024)
Gravitational lensing reveals cool gas within 10-20 kpc around a quiescent galaxy
in Communications Physics
Dehnen W
(2014)
A fast multipole method for stellar dynamics
in Computational Astrophysics and Cosmology
Tsang Y
(2020)
Characterising Jupiter's dynamo radius using its magnetic energy spectrum
in Earth and Planetary Science Letters
Hori K
(2019)
Anelastic torsional oscillations in Jupiter's metallic hydrogen region
in Earth and Planetary Science Letters
Spriggs T
(2022)
A comparison of spectral reconstruction methods applied to non-zero temperature NRQCD meson correlation functions
in EPJ Web of Conferences
Di Carlo M
(2022)
Electromagnetic finite-size effects beyond the point-like approximation
in EPJ Web of Conferences
Attanasio F
(2022)
Equation of state from complex Langevin simulations
in EPJ Web of Conferences
Skullerud J
(2022)
Hadrons at high temperature: An update from the FASTSUM collaboration
in EPJ Web of Conferences
Lawlor D
(2022)
Thermal Transitions in Dense Two-Colour QCD
in EPJ Web of Conferences
Changeat Q
(2022)
Disentangling atmospheric compositions of K2-18 b with next generation facilities.
in Experimental Astronomy
Tinoco-Arenas A
(2022)
Parametric Study of Magnetosheath Jets in 2D Local Hybrid Simulations
in Frontiers in Astronomy and Space Sciences
Matteini L
(2020)
Magnetic Field Turbulence in the Solar Wind at Sub-ion Scales: In Situ Observations and Numerical Simulations
in Frontiers in Astronomy and Space Sciences
Trotta D
(2022)
Single-spacecraft techniques for shock parameters estimation: A systematic approach
in Frontiers in Astronomy and Space Sciences
Stevenson P
(2022)
Mean-field simulations of Es-254 + Ca-48 heavy-ion reactions
in Frontiers in Physics
Zavala J
(2019)
Dark Matter Haloes and Subhaloes
in Galaxies
Moliné Á
(2019)
Properties of Subhalos in the Interacting Dark Matter Scenario
in Galaxies
Rizzuti F
(2024)
Stellar Evolution and Convection in 3D Hydrodynamic Simulations of a Complete Burning Phase
in Galaxies
Barausse E
(2020)
Prospects for fundamental physics with LISA
in General Relativity and Gravitation
Gupta P
(2022)
A study of global magnetic helicity in self-consistent spherical dynamos
in Geophysical & Astrophysical Fluid Dynamics
Read P
(2020)
The turbulent dynamics of Jupiter's and Saturn's weather layers: order out of chaos?
in Geoscience Letters
Young R
(2019)
Simulating Jupiter's weather layer. Part II: Passive ammonia and water cycles
in Icarus
Irwin P
(2019)
Analysis of gaseous ammonia (NH3) absorption in the visible spectrum of Jupiter - Update
in Icarus
Raducan S
(2022)
Ejecta distribution and momentum transfer from oblique impacts on asteroid surfaces
in Icarus
Young R
(2019)
Simulating Jupiter's weather layer. Part I: Jet spin-up in a dry atmosphere
in Icarus
Hardy F
(2023)
Estimating nosocomial infection and its outcomes in hospital patients in England with a diagnosis of COVID-19 using machine learning
in International Journal of Data Science and Analytics
Heyl J
(2023)
Data quality and autism: Issues and potential impacts.
in International Journal of Medical Informatics
Worthy J
(2024)
Evaluation of the bilinear condensate of the planar Thirring model in the strongly coupled region
in International Journal of Modern Physics C
Stevenson P
(2020)
Internuclear potentials from the Sky3D code
in IOP SciNotes
Muia F
(2019)
The fate of dense scalar stars
in Journal of Cosmology and Astroparticle Physics
Barrera-Hinojosa C
(2020)
GRAMSES: a new route to general relativistic N -body simulations in cosmology. Part I. Methodology and code description
in Journal of Cosmology and Astroparticle Physics
Almaraz E
(2020)
Nonlinear structure formation in Bound Dark Energy
in Journal of Cosmology and Astroparticle Physics
Aurrekoetxea J
(2020)
The effects of potential shape on inhomogeneous inflation
in Journal of Cosmology and Astroparticle Physics
Barrera-Hinojosa C
(2020)
GRAMSES: a new route to general relativistic N -body simulations in cosmology. Part II. Initial conditions
in Journal of Cosmology and Astroparticle Physics
Balan S
(2025)
Resonant or asymmetric: the status of sub-GeV dark matter
in Journal of Cosmology and Astroparticle Physics
Givans J
(2022)
Non-linearities in the Lyman-α forest and in its cross-correlation with dark matter halos
in Journal of Cosmology and Astroparticle Physics
| Description | Many new discoveries about the formation and evolution of galaxies, star formation and planet formation have been made possible by the award. |
| Exploitation Route | Many international collaborative projects are supported by the HPC resources provided by DiRAC. |
| Sectors | Aerospace, Defence and Marine; Creative Economy; Digital/Communication/Information Technologies (including Software); Education; Manufacturing, including Industrial Biotechnology; Retail; Other |
| URL | http://www.dirac.ac.uk |
| Description | Significant co-design project with Hewlett Packard Enterprise, including partnership in the HPE/Arm/SUSE Catalyst UK programme. |
| First Year Of Impact | 2017 |
| Sector | Digital/Communication/Information Technologies (including Software) |
| Impact Types | Societal |
| Description | DiRAC 2.5x Project Office 2017-2020 |
| Amount | £300,000 (GBP) |
| Organisation | Science and Technology Facilities Council (STFC) |
| Sector | Public |
| Country | United Kingdom |
| Start | 02/2018 |
| End | 03/2020 |
| Title | Citation analysis and Impact |
| Description | Use of IT to determine the academic impact of eInfrastructure |
| Type Of Material | Improvements to research infrastructure |
| Year Produced | 2017 |
| Provided To Others? | Yes |
| Impact | Helped us to understand emerging trends in DiRAC science, decide the scale and type of IT investments, and direct the development of new technologies (a minimal illustrative sketch of this kind of citation analysis follows this record). |
| URL | http://www.dirac.ac.uk |
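The record above names citation analysis only at a high level, so the following is a minimal, purely illustrative Python sketch of the kind of aggregation involved. It is not the facility's actual tooling; the CSV filename and the "year" and "citation_count" column names are hypothetical, standing in for whatever bibliometric export (e.g. from an ADS or Scopus query of DiRAC-acknowledging papers) is used in practice.

```python
# Illustrative only: summarise a hypothetical CSV export of DiRAC-acknowledging
# publications, counting papers and accrued citations per publication year.
import csv
from collections import defaultdict

def summarise(path):
    papers = defaultdict(int)     # number of publications per year
    citations = defaultdict(int)  # citations accrued by that year's publications
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            year = int(row["year"])                        # hypothetical column name
            papers[year] += 1
            citations[year] += int(row["citation_count"])  # hypothetical column name
    for year in sorted(papers):
        print(f"{year}: {papers[year]} papers, {citations[year]} citations")

if __name__ == "__main__":
    summarise("dirac_publications.csv")  # hypothetical export filename
```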
| Title | Runaway gas accretion and ALMA observations |
| Description | VizieR online Data Catalogue associated with the article 'ALMA observations require slower Core Accretion runaway growth', published in Monthly Notices of the Royal Astronomical Society (bibcode: 2019MNRAS.488L..12N) |
| Type Of Material | Database/Collection of data |
| Year Produced | 2023 |
| Provided To Others? | Yes |
| URL | https://cdsarc.cds.unistra.fr/viz-bin/cat/J/MNRAS/488/L12 |
| Description | Co-design project with Hewlett Packard Enterprise |
| Organisation | Hewlett Packard Enterprise (HPE) |
| Country | United Kingdom |
| Sector | Private |
| PI Contribution | Technical support and operations costs for running the hardware; research workflows to test system performance; and investment of academic and software-engineering time to optimise code for the new hardware. The project will explore the suitability of the hardware for DiRAC workflows and provide feedback to HPE. |
| Collaborator Contribution | In-kind provision of research computing hardware. Value is commercially confidential. |
| Impact | As this collaboration is about to commence, there are no outcomes to report at this point. |
| Start Year | 2018 |
| Description | DiRAC |
| Organisation | Science and Technology Facilities Council (STFC) |
| Department | Distributed Research Utilising Advanced Computing |
| Country | United Kingdom |
| Sector | Academic/University |
| PI Contribution | I am the PI for two research grants for the procurement and running of the Complexity@DiRAC High Performance Computing cluster at the University of Leicester. This cluster is now in active operation as a national HPC facility. |
| Collaborator Contribution | DiRAC is the facility which provides HPC resources for the theoretical astrophysics and particle physics communities within STFC. |
| Impact | The establishment and running of a new HPC cluster at the University of Leicester as part of the DiRAC national facility. |
| Start Year | 2011 |
| Description | Nuclei from Lattice QCD |
| Organisation | RIKEN |
| Department | RIKEN-Nishina Center for Accelerator-Based Science |
| Country | Japan |
| Sector | Public |
| PI Contribution | Surrey performed ab initio studies of LQCD-derived nuclear forces |
| Collaborator Contribution | Work by Prof. Hatsuda and collaborators at the iTHEMS and Quantum Hadron Physics Laboratory to provide nuclear forces derived from LQCD |
| Impact | Phys. Rev. C 97, 021303(R) |
| Start Year | 2015 |
| Description | STFC Centres for Doctoral Training in Data Intensive Science |
| Organisation | University of Leicester |
| Department | STFC DiRAC Complexity Cluster (HPC Facility Leicester) |
| Country | United Kingdom |
| Sector | Academic/University |
| PI Contribution | Support for the STFC Centres for Doctoral Training (CDT) in Data Intensive Science - DiRAC is a partner in five of the eight newly established STFC CDTs and is actively engaged with them in developing industrial partnerships. DiRAC is also offering placements to CDT students interested in Research Software Engineering roles. |
| Collaborator Contribution | Students to work on interesting technical problems for DiRAC |
| Impact | This is the first year |
| Start Year | 2017 |