DiRAC-2: Recurrent Costs for Complexity@DiRAC Cluster at University of Leicester
Lead Research Organisation:
University of Leicester
Department Name: Physics and Astronomy
Abstract
This award is for the recurrent costs of the Complexity@DiRAC cluster at the University of Leicester. It will cover electricity and support staff costs for the cluster, which is part of the DiRAC-2 national facility.
Planned Impact
The pathways to impact for the project are as agreed at the DiRAC PMB meeting on 21 November 2011 and subsequently reported on in the annual reports of the facility.
The high-performance computing applications supported by DiRAC typically involve new algorithms and implementations optimised for high energy efficiency. These impose demands on computer architectures that the computing industry has found useful for hardware and system software design and testing.
DiRAC researchers have ongoing collaborations with computing companies that maintain this strong connection between the scientific goals of the DiRAC Consortium and the development of new computing technologies driving the commercial high-performance computing market. This brings economic benefits to the companies involved and makes more powerful computing capabilities available to other application areas, including many that address socio-economic challenges.
Boyle (University of Edinburgh) co-designed the Blue Gene/Q compute chip with IBM. This is now deployed in 1.3 Pflop/s systems at Edinburgh and Daresbury and at 15 other sites worldwide, including the world's largest system at Lawrence Livermore National Laboratory. This is the greenest HPC architecture in the world and offers a route to affordable petascale and exascale computing that will have profound effects on the Energy, Health, Environment and Security sectors.
Boyle and IBM have 4 US patents pending resulting from the Blue Gene/Q chip-set design project. Boyle was a co-author of IBM's Gauss Award-winning paper at the International Supercomputing Conference and has co-authored IEEE and IBM Journal papers on the Blue Gene/Q architecture with IBM.
Falle (Leeds University) partially developed the MG code on DiRAC. This has been used in the National Grid COOLTRANS project to model the dispersion of CO2 from high-pressure pipelines carrying it for carbon sequestration.
At UCL, a virtual quantum laboratory suite has been created by the UCL spinout firm QUANTEMOL. It has applications in industry, energy, health and environmental monitoring.
Calleja (Cambridge University) is using DiRAC to work with Xyratex, the UK's leading disk manufacturer, to develop the fastest storage arrays in the world.
The COSMOS consortium (Shellard) has had a long-standing collaboration with SGI (since 1997) and with Intel (since 2003) which has allowed access to leading-edge shared-memory technologies, including the world's first UV2000 in 2012, which was also the first SMP system enabled with Intel Phi (Knights Corner) processors. Adaptive Computing is using the COSMOS@DiRAC platform to develop a single-image version of its MOAB HPC Suite.
Publications
Adamek J
(2020)
Numerical solutions to Einstein's equations in a shearing-dust universe: a code comparison
in Classical and Quantum Gravity
Pandya A
(2022)
Dynamics of a nonminimally coupled scalar field in asymptotically AdS4 spacetime
in Classical and Quantum Gravity
Radia M
(2022)
Lessons for adaptive mesh refinement in numerical relativity
in Classical and Quantum Gravity
Horowitz G
(2019)
Creating a traversable wormhole
in Classical and Quantum Gravity
Dehnen W
(2014)
A fast multipole method for stellar dynamics
in Computational Astrophysics and Cosmology
Hori K
(2019)
Anelastic torsional oscillations in Jupiter's metallic hydrogen region
in Earth and Planetary Science Letters
Lawlor D
(2022)
Thermal Transitions in Dense Two-Colour QCD
in EPJ Web of Conferences
Skullerud J
(2022)
Hadrons at high temperature: An update from the FASTSUM collaboration
in EPJ Web of Conferences
Di Carlo M
(2022)
Electromagnetic finite-size effects beyond the point-like approximation
in EPJ Web of Conferences
Attanasio F
(2022)
Equation of state from complex Langevin simulations
in EPJ Web of Conferences
Spriggs T
(2022)
A comparison of spectral reconstruction methods applied to non-zero temperature NRQCD meson correlation functions
in EPJ Web of Conferences
Changeat Q
(2022)
Disentangling atmospheric compositions of K2-18 b with next generation facilities.
in Experimental astronomy
Trotta D
(2022)
Single-spacecraft techniques for shock parameters estimation: A systematic approach
in Frontiers in Astronomy and Space Sciences
Tinoco-Arenas A
(2022)
Parametric Study of Magnetosheath Jets in 2D Local Hybrid Simulations
in Frontiers in Astronomy and Space Sciences
Matteini L
(2020)
Magnetic Field Turbulence in the Solar Wind at Sub-ion Scales: In Situ Observations and Numerical Simulations
in Frontiers in Astronomy and Space Sciences
Stevenson P
(2022)
Mean-field simulations of Es-254 + Ca-48 heavy-ion reactions
in Frontiers in Physics
Moliné Á
(2019)
Properties of Subhalos in the Interacting Dark Matter Scenario
in Galaxies
Zavala J
(2019)
Dark Matter Haloes and Subhaloes
in Galaxies
Barausse E
(2020)
Prospects for fundamental physics with LISA
in General Relativity and Gravitation
Gupta P
(2022)
A study of global magnetic helicity in self-consistent spherical dynamos
in Geophysical & Astrophysical Fluid Dynamics
Read P
(2020)
The turbulent dynamics of Jupiter's and Saturn's weather layers: order out of chaos?
in Geoscience Letters
Young R
(2019)
Simulating Jupiter's weather layer. Part II: Passive ammonia and water cycles
in Icarus
Raducan S
(2022)
Ejecta distribution and momentum transfer from oblique impacts on asteroid surfaces
in Icarus
Young R
(2019)
Simulating Jupiter's weather layer. Part I: Jet spin-up in a dry atmosphere
in Icarus
Hardy F
(2023)
Estimating nosocomial infection and its outcomes in hospital patients in England with a diagnosis of COVID-19 using machine learning
in International Journal of Data Science and Analytics
Heyl J
(2023)
Data quality and autism: Issues and potential impacts
in International Journal of Medical Informatics
Stevenson P
(2020)
Internuclear potentials from the Sky3D code
in IOP SciNotes
Macpherson H
(2023)
Cosmological distances with general-relativistic ray tracing: framework and comparison to cosmographic predictions
in Journal of Cosmology and Astroparticle Physics
Aurrekoetxea J
(2020)
The effects of potential shape on inhomogeneous inflation
in Journal of Cosmology and Astroparticle Physics
Barrera-Hinojosa C
(2020)
GRAMSES: a new route to general relativistic N-body simulations in cosmology. Part I. Methodology and code description
in Journal of Cosmology and Astroparticle Physics
Muia F
(2019)
The fate of dense scalar stars
in Journal of Cosmology and Astroparticle Physics
Hernández-Aguayo C
(2022)
Fast full N-body simulations of generic modified gravity: derivative coupling models
in Journal of Cosmology and Astroparticle Physics
Widdicombe J
(2020)
Black hole formation in relativistic Oscillaton collisions
in Journal of Cosmology and Astroparticle Physics
Aviles A
(2020)
Marked correlation functions in perturbation theory
in Journal of Cosmology and Astroparticle Physics
Heinesen A
(2022)
A prediction for anisotropies in the nearby Hubble flow
in Journal of Cosmology and Astroparticle Physics
Clarke P
(2021)
Probing inflation with precision bispectra
in Journal of Cosmology and Astroparticle Physics
Pedersen C
(2020)
Massive neutrinos and degeneracies in Lyman-alpha forest simulations
in Journal of Cosmology and Astroparticle Physics
Becker C
(2020)
Proca-stinated cosmology. Part I. A N-body code for the vector Galileon
in Journal of Cosmology and Astroparticle Physics
De Jong E
(2022)
Primordial black hole formation with full numerical relativity
in Journal of Cosmology and Astroparticle Physics
Pedersen C
(2021)
An emulator for the Lyman-α forest in beyond-ΛCDM cosmologies
in Journal of Cosmology and Astroparticle Physics
Leo M
(2020)
Constraining structure formation using EDGES
in Journal of Cosmology and Astroparticle Physics
Shiraishi M
(2019)
General modal estimation for cross-bispectra
in Journal of Cosmology and Astroparticle Physics
Almaraz E
(2020)
Nonlinear structure formation in Bound Dark Energy
in Journal of Cosmology and Astroparticle Physics
Givans J
(2022)
Non-linearities in the Lyman-α forest and in its cross-correlation with dark matter halos
in Journal of Cosmology and Astroparticle Physics
Barrera-Hinojosa C
(2020)
GRAMSES: a new route to general relativistic N-body simulations in cosmology. Part II. Initial conditions
in Journal of Cosmology and Astroparticle Physics
Bozorgnia N
(2019)
On the correlation between the local dark matter and stellar velocities
in Journal of Cosmology and Astroparticle Physics
Poole-McKenzie R
(2020)
Informing dark matter direct detection limits with the ARTEMIS simulations
in Journal of Cosmology and Astroparticle Physics
Bozorgnia N
(2020)
The dark matter component of the Gaia radially anisotropic substructure
in Journal of Cosmology and Astroparticle Physics
Nazari Z
(2021)
Oscillon collapse to black holes
in Journal of Cosmology and Astroparticle Physics
Description | Many new discoveries about the formation and evolution of galaxies, star formation, and planet formation have been made possible by the award. |
Exploitation Route | Many international collaborative projects are supported by the HPC resources provided by DiRAC. |
Sectors | Aerospace, Defence and Marine, Creative Economy, Digital/Communication/Information Technologies (including Software), Education, Manufacturing, including Industrial Biotechnology, Retail, Other |
URL | http://www.dirac.ac.uk |
Description | Significant co-design project with Hewlett-Packard Enterprise, including partnership in the HPE/Arm/Suse Catalyst UK programme. |
First Year Of Impact | 2017 |
Sector | Digital/Communication/Information Technologies (including Software) |
Impact Types | Societal |
Description | DiRAC 2.5x Project Office 2017-2020 |
Amount | £300,000 (GBP) |
Organisation | Science and Technology Facilities Council (STFC) |
Sector | Public |
Country | United Kingdom |
Start | 02/2018 |
End | 03/2020 |
Title | Citation analysis and Impact |
Description | Use of IT to determine academic impact of eInfrastructure |
Type Of Material | Improvements to research infrastructure |
Year Produced | 2017 |
Provided To Others? | Yes |
Impact | Improved understanding of emerging trends in DiRAC science, which helped decide the scale and type of IT investments and directed the development of new technologies |
URL | http://www.dirac.ac.uk |
Description | Co-design project with Hewlett Packard Enterprise |
Organisation | Hewlett Packard Enterprise (HPE) |
Country | United Kingdom |
Sector | Private |
PI Contribution | Technical support and operations costs for running the hardware. Research workflows to test the system performance, and investment of academic time and software engineering time to optimise code for new hardware. Project will explore suitability of hardware for DiRAC workflows and provide feedback to HPE. |
Collaborator Contribution | In-kind provision of research computing hardware. Value is commercially confidential. |
Impact | As this collaboration is about to commence, there are no outcomes to report at this point. |
Start Year | 2018 |
Description | Nuclei from Lattice QCD |
Organisation | RIKEN |
Department | RIKEN-Nishina Center for Accelerator-Based Science |
Country | Japan |
Sector | Public |
PI Contribution | Surrey performed ab initio studies of LQCD-derived nuclear forces |
Collaborator Contribution | Work by Prof. Hatsuda and collaborators at the iTHEMS and Quantum Hadron Physics Laboratory to provide nuclear forces derived from LQCD |
Impact | Phys. Rev. C 97, 021303(R) |
Start Year | 2015 |
Description | STFC Centres for Doctoral Training in Data Intensive Science |
Organisation | University of Leicester |
Department | STFC DiRAC Complexity Cluster (HPC Facility Leicester) |
Country | United Kingdom |
Sector | Academic/University |
PI Contribution | Support for STFC Centres for Doctoral Training (CDT) in Data Intensive Science - DiRAC is a partner in five of the eight newly established STFC CDTs, and is actively engaged with them in developing industrial partnerships. DiRAC is also offering placements to CDT students interested in Research Software Engineering roles. |
Collaborator Contribution | Students to work on interesting technical problems for DiRAC |
Impact | This is the first year of the partnership |
Start Year | 2017 |