DiRAC-2: Recurrent Costs for Complexity@DiRAC Cluster at University of Leicester
Lead Research Organisation: University of Leicester
Department Name: Physics and Astronomy
Abstract
This award covers the recurrent costs of the Complexity@DiRAC cluster at the University of Leicester, namely the electricity and support-staff costs of the cluster, which is part of the DiRAC-2 national facility.
Planned Impact
The pathways to impact for the project are as agreed at the DiRAC PMB meeting on 21 November 2011 and subsequently reported on in the annual reports of the facility.
The high-performance computing applications supported by DiRAC typically involve new algorithms and implementations optimised for high energy efficiency. The demands these place on computer architectures have proved useful to the computing industry for hardware and system-software design and testing.
DiRAC researchers have ongoing collaborations with computing companies that maintain this strong connection between the scientific goals of the DiRAC Consortium and the development of new computing technologies driving the commercial high-performance computing market. These collaborations bring economic benefits to the companies involved and make more powerful computing capabilities available to other application areas, including many that address socio-economic challenges.
Boyle (University of Edinburgh) co-designed the Blue Gene/Q compute chip with IBM. It is now deployed in 1.3 Pflop/s systems at Edinburgh and Daresbury and at 15 other sites worldwide, including the world's largest system at Lawrence Livermore National Laboratory. Blue Gene/Q is the greenest HPC architecture in the world and offers a route to affordable petascale and exascale computing, with profound implications for the energy, health, environment and security sectors.
Boyle and IBM have four US patents pending from the Blue Gene/Q chip-set design project. Boyle co-authored IBM's Gauss Award-winning paper at the International Supercomputing Conference and has co-authored IEEE and IBM Journal papers on the Blue Gene/Q architecture with IBM.
Falle (University of Leeds) carried out part of the development of the MG code on DiRAC. The code has been used in the National Grid COOLTRANS project to model the dispersion of CO2 from the high-pressure pipelines that carry it for carbon sequestration.
At UCL, a virtual quantum laboratory suite has been created by the UCL spin-out firm Quantemol. It has applications in industry, energy, health and environmental monitoring.
Calleja (University of Cambridge) is using DiRAC to work with Xyratex, the UK's leading disk manufacturer, to develop the fastest storage arrays in the world.
The COSMOS consortium (Shellard) has long-standing collaborations with SGI (since 1997) and Intel (since 2003) that have provided access to leading-edge shared-memory technologies, including the world's first UV2000 in 2012, which was also the first SMP system equipped with Intel Xeon Phi (Knights Corner) processors. Adaptive Computing are using the COSMOS@DiRAC platform to develop a single-image version of their MOAB HPC Suite.
Publications
Baek G (2020) Radiative Transfer Modeling of EC 53: An Episodically Accreting Class I Young Stellar Object, in The Astrophysical Journal
Baes M (2019) The cosmic spectral energy distribution in the EAGLE simulation, in Monthly Notices of the Royal Astronomical Society
Bahé Y (2021) Strongly lensed cluster substructures are not in tension with ΛCDM, in Monthly Notices of the Royal Astronomical Society
Bahé Y (2022) The importance of black hole repositioning for galaxy formation simulations, in Monthly Notices of the Royal Astronomical Society
Bahé Y (2019) Disruption of satellite galaxies in simulated groups and clusters: the roles of accretion time, baryons, and pre-processing, in Monthly Notices of the Royal Astronomical Society
Balan S (2025) Resonant or asymmetric: the status of sub-GeV dark matter, in Journal of Cosmology and Astroparticle Physics
Ballabio G (2021) HD 143006: circumbinary planet or misaligned disc?, in Monthly Notices of the Royal Astronomical Society
Ballabio G (2018) Enforcing dust mass conservation in 3D simulations of tightly coupled grains with the Phantom SPH code, in Monthly Notices of the Royal Astronomical Society
Bamber J (2021) Quasinormal modes of growing dirty black holes, in Physical Review D
Banerjee A (2024) Atmospheric Retrievals Suggest the Presence of a Secondary Atmosphere and Possible Sulfur Species on L98-59 d from JWST NIRSpec G395H Transmission Spectroscopy, in The Astrophysical Journal Letters
Banfi A (2016) Two-Jet Rate in e⁺e⁻ at Next-to-Next-to-Leading-Logarithmic Order, in Physical Review Letters
Bantilan H (2021) Cauchy evolution of asymptotically global AdS spacetimes with no symmetries, in Physical Review D
Bantilan H (2020) Real-Time Dynamics of Plasma Balls from Holography, in Physical Review Letters
Bantilan H (2019) End point of nonaxisymmetric black hole instabilities in higher dimensions, in Physical Review D
Baraffe I (2017) Lithium Depletion in Solar-like Stars: Effect of Overshooting Based on Realistic Multi-dimensional Simulations, in The Astrophysical Journal Letters
Baraffe I (2022) Local heating due to convective overshooting and the solar modelling problem, in Astronomy & Astrophysics
Barausse E (2020) Prospects for fundamental physics with LISA, in General Relativity and Gravitation
Barber C (2019) Calibrated, cosmological hydrodynamical simulations with variable IMFs III: spatially resolved properties and evolution, in Monthly Notices of the Royal Astronomical Society
Barbieri C (2019) Lepton scattering from ⁴⁰Ar and ⁴⁸Ti in the quasielastic peak region, in Physical Review C
Barker A (2019) Angular momentum transport by the GSF instability: non-linear simulations at the equator, in Monthly Notices of the Royal Astronomical Society
Barnes D (2021) Characterizing hydrostatic mass bias with mock-X, in Monthly Notices of the Royal Astronomical Society
Barone A (2023) Approaches to inclusive semileptonic B(s)-meson decays from Lattice QCD, in Journal of High Energy Physics
Barone T (2024) Gravitational lensing reveals cool gas within 10-20 kpc around a quiescent galaxy, in Communications Physics
Barrera-Hinojosa C (2020) GRAMSES: a new route to general relativistic N-body simulations in cosmology. Part I. Methodology and code description, in Journal of Cosmology and Astroparticle Physics
Barrera-Hinojosa C (2020) GRAMSES: a new route to general relativistic N-body simulations in cosmology. Part II. Initial conditions, in Journal of Cosmology and Astroparticle Physics
| Description | The award has enabled many new discoveries about the formation and evolution of galaxies, star formation and planet formation. |
| Exploitation Route | Many international collaborative projects are supported by the HPC resources provided by DiRAC. |
| Sectors | Aerospace, Defence and Marine; Creative Economy; Digital/Communication/Information Technologies (including Software); Education; Manufacturing, including Industrial Biotechnology; Retail; Other |
| URL | http://www.dirac.ac.uk |
| Description | Significant co-design project with Hewlett Packard Enterprise, including partnership in the HPE/Arm/SUSE Catalyst UK programme. |
| First Year Of Impact | 2017 |
| Sector | Digital/Communication/Information Technologies (including Software) |
| Impact Types | Societal |
| Description | DiRAC 2.5x Project Office 2017-2020 |
| Amount | £300,000 (GBP) |
| Organisation | Science and Technology Facilities Council (STFC) |
| Sector | Public |
| Country | United Kingdom |
| Start | 02/2018 |
| End | 03/2020 |
| Title | Citation analysis and Impact |
| Description | Use of IT to determine the academic impact of eInfrastructure |
| Type Of Material | Improvements to research infrastructure |
| Year Produced | 2017 |
| Provided To Others? | Yes |
| Impact | Improved understanding of emerging trends in DiRAC science, which helped determine the scale and type of IT investments and directed the development of new technologies |
| URL | http://www.dirac.ac.uk |
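The citation-analysis approach described in the record above can be illustrated with public bibliographic services. The sketch below is a minimal, hypothetical example rather than DiRAC's actual tooling: it assumes a NASA ADS API token and queries the ADS search API for citation counts of a list of bibcodes (the example bibcode is taken from the dataset record that follows).

```python
import requests

ADS_URL = "https://api.adsabs.harvard.edu/v1/search/query"
ADS_TOKEN = "YOUR_ADS_API_TOKEN"  # placeholder: supply a real NASA ADS token

def citation_counts(bibcodes):
    """Return {bibcode: citation_count} for a list of ADS bibcodes."""
    query = " OR ".join(f"bibcode:{b}" for b in bibcodes)
    resp = requests.get(
        ADS_URL,
        params={"q": query, "fl": "bibcode,citation_count", "rows": len(bibcodes)},
        headers={"Authorization": f"Bearer {ADS_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    docs = resp.json()["response"]["docs"]
    return {d["bibcode"]: d.get("citation_count", 0) for d in docs}

if __name__ == "__main__":
    # e.g. the MNRAS letter catalogued in the dataset record below
    counts = citation_counts(["2019MNRAS.488L..12N"])
    print(f"{sum(counts.values())} total citations across {len(counts)} papers")
```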
| Title | Runaway gas accretion and ALMA observations |
| Description | VizieR online data catalogue associated with the article 'ALMA observations require slower Core Accretion runaway growth', published in Monthly Notices of the Royal Astronomical Society (bibcode: 2019MNRAS.488L..12N) |
| Type Of Material | Database/Collection of data |
| Year Produced | 2023 |
| Provided To Others? | Yes |
| URL | https://cdsarc.cds.unistra.fr/viz-bin/cat/J/MNRAS/488/L12 |
| Description | Co-design project with Hewlett Packard Enterprise |
| Organisation | Hewlett Packard Enterprise (HPE) |
| Country | United Kingdom |
| Sector | Private |
| PI Contribution | Technical support and operations costs for running the hardware; research workflows to test system performance; and investment of academic and software-engineering time to optimise code for the new hardware. The project will explore the suitability of the hardware for DiRAC workflows and provide feedback to HPE. |
| Collaborator Contribution | In-kind provision of research computing hardware. Value is commercially confidential. |
| Impact | As this collaboration is about to commence, there are no outcomes to report at this point. |
| Start Year | 2018 |
| Description | DiRAC |
| Organisation | Science and Technology Facilities Council (STFC) |
| Department | Distributed Research Utilising Advanced Computing |
| Country | United Kingdom |
| Sector | Academic/University |
| PI Contribution | I am the PI for two research grants for the procurement and running of the Complexity@DiRAC High Performance Computing cluster at the University of Leicester. This cluster is now in active operation as a national HPC facility. |
| Collaborator Contribution | DiRAC is the facility which provides HPC resources for the theoretical astrophysics and particle physics communities within STFC. |
| Impact | The establishment and running of a new HPC cluster at the University of Leicester as part of the DiRAC national facility. |
| Start Year | 2011 |
| Description | Nuclei from Lattice QCD |
| Organisation | RIKEN |
| Department | RIKEN-Nishina Center for Accelerator-Based Science |
| Country | Japan |
| Sector | Public |
| PI Contribution | Surrey performed ab initio studies of LQCD-derived nuclear forces |
| Collaborator Contribution | Work by Prof. Hatsuda and collaborators at iTHEMS and the Quantum Hadron Physics Laboratory to provide nuclear forces derived from LQCD |
| Impact | Phys. Rev. C 97, 021303(R) |
| Start Year | 2015 |
| Description | STFC Centres for Doctoral Training in Data Intensive Science |
| Organisation | University of Leicester |
| Department | STFC DiRAC Complexity Cluster (HPC Facility Leicester) |
| Country | United Kingdom |
| Sector | Academic/University |
| PI Contribution | Support for the STFC Centres for Doctoral Training (CDT) in Data Intensive Science. DiRAC is a partner in five of the eight newly established STFC CDTs and is actively engaged with them in developing industrial partnerships. DiRAC also offers placements to CDT students interested in Research Software Engineering roles. |
| Collaborator Contribution | Students to work on interesting technical problems for DiRAC |
| Impact | This is the first year of the partnership, so there are no impacts to report yet. |
| Start Year | 2017 |