DiRAC-2: Recurrent Costs for Complexity@DiRAC Cluster at University of Leicester
Lead Research Organisation:
University of Leicester
Department Name: Physics and Astronomy
Abstract
This award is for the recurrent costs of the Complexity@DiRAC cluster at the University of Leicester. It will cover the electricity and support-staff costs of the cluster, which is part of the DiRAC-2 national facility.
Planned Impact
The pathways to impact for the project are as agreed at the DiRAC PMB meeting on 21 November 2011 and subsequently reported on in the annual reports of the facility.
The high-performance computing applications supported by DiRAC typically involve new algorithms and implementations optimised for high energy efficiency. These impose demands on computer architectures that the computing industry has found useful for hardware and system-software design and testing.
DiRAC researchers have ongoing collaborations with computing companies that maintain this strong connection between the scientific goals of the DiRAC Consortium and the development of new computing technologies driving the commercial high-performance computing market. This brings economic benefits to the companies involved and makes more powerful computing capabilities available to other application areas, including many that address socio-economic challenges.
Boyle (University of Edinburgh) co-designed the Blue Gene/Q compute chip with IBM. This is now deployed in 1.3 Pflop/s systems at Edinburgh and Daresbury and at 15 other sites worldwide, including the world's largest system at Lawrence Livermore National Laboratory. It is the greenest HPC architecture in the world and offers a route to affordable petascale and exascale computing that will have profound effects on the Energy, Health, Environment and Security sectors.
Boyle and IBM have four US patents pending from the Blue Gene/Q chip-set design project. Boyle was a co-author of IBM's Gauss Award-winning paper at the International Supercomputing Conference and has co-authored IEEE and IBM Journal papers on the Blue Gene/Q architecture with IBM.
Falle (Leeds University) partially developed the MG code on DiRAC. This has been used in the National Grid COOLTRANS project to model the dispersion of CO2 from high-pressure pipelines carrying CO2 for carbon sequestration.
At UCL, a virtual quantum laboratory suite has been created by the UCL spinout firm QUANTEMOL. It has applications in industry, energy, health and environmental monitoring.
Calleja (Cambridge University) is using DiRAC to work with Xyratex, the UK's leading disk manufacturer, to develop the fastest storage arrays in the world.
The COSMOS consortium (Shellard) has had a long-standing collaboration with SGI (since 1997) and with Intel (since 2003), which has allowed access to leading-edge shared-memory technologies, including the world's first UV2000 in 2012, which was also the first SMP system enabled with Intel Xeon Phi (Knights Corner) processors. Adaptive Computing is using the COSMOS@DiRAC platform to develop a single-image version of its Moab HPC Suite.
Publications
Mitchell M
(2019)
A general framework to test gravity using galaxy clusters II: A universal model for the halo concentration in f(R) gravity
in Monthly Notices of the Royal Astronomical Society
Mitchell M
(2021)
A general framework to test gravity using galaxy clusters III: observable-mass scaling relations in f(R) gravity
in Monthly Notices of the Royal Astronomical Society
Mitchell M
(2021)
A general framework to test gravity using galaxy clusters IV: cluster and halo properties in DGP gravity
in Monthly Notices of the Royal Astronomical Society
Richings J
(2021)
A high-resolution cosmological simulation of a strong gravitational lens
in Monthly Notices of the Royal Astronomical Society
Bowesman CA
(2024)
A hyperfine-resolved spectroscopic model for vanadium monoxide (51V16O).
in Molecular physics
Viallet M
(2016)
A Jacobian-free Newton-Krylov method for time-implicit multidimensional hydrodynamics: Physics-based preconditioning for sound waves and thermal diffusion
in Astronomy & Astrophysics
Goyal J
(2020)
A library of self-consistent simulated exoplanet atmospheres
in Monthly Notices of the Royal Astronomical Society
Smith A
(2022)
A light-cone catalogue from the Millennium-XXL simulation: improved spatial interpolation and colour distributions for the DESI BGS
in Monthly Notices of the Royal Astronomical Society
Koudmani S
(2021)
A little FABLE: exploring AGN feedback in dwarf galaxies with cosmological simulations
in Monthly Notices of the Royal Astronomical Society
Lovell C
(2022)
A machine learning approach to mapping baryons on to dark matter haloes using the eagle and C-EAGLE simulations
in Monthly Notices of the Royal Astronomical Society
Wilson B
(2022)
A measurement of the Lyβ forest power spectrum and its cross with the Lyα forest in X-Shooter XQ-100
in Monthly Notices of the Royal Astronomical Society
Pezzella M
(2021)
A method for calculating temperature-dependent photodissociation cross sections and rates.
in Physical chemistry chemical physics : PCCP
Wyper P
(2024)
A Model for Flux Rope Formation and Disconnection in Pseudostreamer Coronal Mass Ejections
in The Astrophysical Journal
Lee E
(2022)
A multisimulation study of relativistic SZ temperature scalings in galaxy clusters and groups
in Monthly Notices of the Royal Astronomical Society
Stothert L
(2019)
A new approach to finding galaxy groups using Markov Clustering
in Monthly Notices of the Royal Astronomical Society: Letters
Aslanyan V
(2024)
A New Field Line Tracer for the Study of Coronal Magnetic Topologies
in The Astrophysical Journal
Wareing C
(2018)
A new mechanical stellar wind feedback model for the Rosette Nebula
in Monthly Notices of the Royal Astronomical Society
Phillips M
(2020)
A new set of atmosphere and evolution models for cool T-Y brown dwarfs and giant exoplanets
in Astronomy & Astrophysics
Pagano P
(2019)
A New Space Weather Tool for Identifying Eruptive Active Regions
in The Astrophysical Journal
Armijo J
(2024)
A new test of gravity - I. Introduction to the method
in Monthly Notices of the Royal Astronomical Society
Armijo J
(2024)
A new test of gravity - II. Application of marked correlation functions to luminous red galaxy samples
in Monthly Notices of the Royal Astronomical Society
Diakogiannis F
(2017)
A novel JEAnS analysis of the Fornax dwarf using evolutionary algorithms: mass follows light with signs of an off-centre merger
in Monthly Notices of the Royal Astronomical Society
Bisbas T
(2014)
A photodissociation region study of NGC 4038
in Monthly Notices of the Royal Astronomical Society
Heinesen A
(2022)
A prediction for anisotropies in the nearby Hubble flow
in Journal of Cosmology and Astroparticle Physics
| Description | Many new discoveries about the formation and evolution of galaxies, star formation, planet formation have been made possible by the award. |
| Exploitation Route | Many international collaborative projects are supported by the HPC resources provided by DiRAC. |
| Sectors | Aerospace, Defence and Marine; Creative Economy; Digital/Communication/Information Technologies (including Software); Education; Manufacturing, including Industrial Biotechnology; Retail; Other |
| URL | http://www.dirac.ac.uk |
| Description | Significant co-design project with Hewlett-Packard Enterprise, including partnership in the HPE/Arm/Suse Catalyst UK programme. |
| First Year Of Impact | 2017 |
| Sector | Digital/Communication/Information Technologies (including Software) |
| Impact Types | Societal |
| Description | DiRAC 2.5x Project Office 2017-2020 |
| Amount | £300,000 (GBP) |
| Organisation | Science and Technology Facilities Council (STFC) |
| Sector | Public |
| Country | United Kingdom |
| Start | 02/2018 |
| End | 03/2020 |
| Title | Citation analysis and Impact |
| Description | Use of IT to determine the academic impact of eInfrastructure |
| Type Of Material | Improvements to research infrastructure |
| Year Produced | 2017 |
| Provided To Others? | Yes |
| Impact | Understood emerging trends in DiRAC science, helped decide the scale and type of IT investments, and directed us to develop new technologies |
| URL | http://www.dirac.ac.uk |
| Title | Runaway gas accretion and ALMA observations |
| Description | VizieR online data catalogue associated with the article 'ALMA observations require slower Core Accretion runaway growth', published in Monthly Notices of the Royal Astronomical Society (bibcode: 2019MNRAS.488L..12N) |
| Type Of Material | Database/Collection of data |
| Year Produced | 2023 |
| Provided To Others? | Yes |
| URL | https://cdsarc.cds.unistra.fr/viz-bin/cat/J/MNRAS/488/L12 |
| Description | Co-design project with Hewlett Packard Enterprise |
| Organisation | Hewlett Packard Enterprise (HPE) |
| Country | United Kingdom |
| Sector | Private |
| PI Contribution | Technical support and operations costs for running the hardware. Research workflows to test the system performance, and investment of academic time and software engineering time to optimise code for new hardware. Project will explore suitability of hardware for DiRAC workflows and provide feedback to HPE. |
| Collaborator Contribution | In-kind provision of research computing hardware. Value is commercially confidential. |
| Impact | As this collaboration is about to commence, there are no outcomes to report at this point. |
| Start Year | 2018 |
| Description | DiRAC |
| Organisation | Science and Technology Facilities Council (STFC) |
| Department | Distributed Research Utilising Advanced Computing |
| Country | United Kingdom |
| Sector | Academic/University |
| PI Contribution | I am the PI for two research grants for the procurement and running of the Complexity@DiRAC High Performance Computing cluster at the University of Leicester. This cluster is now in active operation as a national HPC facility. |
| Collaborator Contribution | DiRAC is the facility which provides HPC resources for the theoretical astrophysics and particle physics communities within STFC. |
| Impact | The establishment and running of a new HPC cluster at the University of Leicester as part of the DiRAC national facility. |
| Start Year | 2011 |
| Description | Nuclei from Lattice QCD |
| Organisation | RIKEN |
| Department | RIKEN-Nishina Center for Accelerator-Based Science |
| Country | Japan |
| Sector | Public |
| PI Contribution | Surrey performed ab initio studies of LQCD-derived nuclear forces |
| Collaborator Contribution | Work by Prof. Hatsuda and collaborators at the iTHEMS and Quantum Hadron Physics Laboratory to provide nuclear forces derived from LQCD |
| Impact | Phys. Rev. C 97, 021303(R) |
| Start Year | 2015 |
| Description | STFC Centres for Doctoral Training in Data Intensive Science |
| Organisation | University of Leicester |
| Department | STFC DiRAC Complexity Cluster (HPC Facility Leicester) |
| Country | United Kingdom |
| Sector | Academic/University |
| PI Contribution | Support for the STFC Centres for Doctoral Training (CDT) in Data Intensive Science - DiRAC is a partner in five of the eight newly established STFC CDTs and is actively engaged with them in developing industrial partnerships. DiRAC is also offering placements to CDT students interested in Research Software Engineering roles. |
| Collaborator Contribution | Students to work on interesting technical problems for DiRAC |
| Impact | This is the first year |
| Start Year | 2017 |