DiRAC-2: Recurrent Costs for Complexity@DiRAC Cluster at University of Leicester
Lead Research Organisation:
University of Leicester
Department Name: Physics and Astronomy
Abstract
This award is for the recurrent costs of the Complexity@DiRAC cluster at the University of Leicester. It will cover electricity and support staff costs for the cluster, which is part of the DiRAC-2 national facility.
Planned Impact
The pathways to impact for the project are as agreed at the DiRAC PMB meeting on 21 November 2011 and subsequently reported on in the annual reports of the facility.
The high-performance computing applications supported by DiRAC typically involve new algorithms and implementations optimised for high energy efficiency. These impose demands on computer architectures that the computing industry has found useful for hardware and system software design and testing.
DiRAC researchers have ongoing collaborations with computing companies that maintain this strong connection between the scientific goals of the DiRAC Consortium and the development of new computing technologies driving the commercial high-performance computing market. This brings economic benefits to the companies involved and makes more powerful computing capabilities available to other application areas, including many that address socio-economic challenges.
Boyle (University of Edinburgh) co-designed the Blue Gene/Q compute chip with IBM. This is now deployed in 1.3 Pflop/s systems at Edinburgh and Daresbury and at 15 other sites worldwide, including the world's largest system at Lawrence Livermore National Laboratory. This is the greenest HPC architecture in the world and offers a route to affordable petascale and exascale computing that will have profound effects on the Energy, Health, Environment and Security sectors.
Boyle and IBM have four US patents pending resulting from the Blue Gene/Q chip-set design project. Boyle was a co-author of IBM's Gauss Award-winning paper at the International Supercomputing Conference and has co-authored IEEE and IBM Journal papers on the Blue Gene/Q architecture with IBM.
Falle (Leeds University) partially developed the MG code on DiRAC. This has been used in the National Grid COOLTRANS project to model the dispersion of CO2 from high-pressure pipelines used for carbon sequestration.
At UCL, a virtual quantum laboratory suite has been created by the UCL spinout firm QUANTEMOL. It has applications in industry, energy, health and environmental monitoring.
Calleja (Cambridge University) is using DiRAC to work with Xyratex, the UK's leading disk manufacturer, to develop the fastest storage arrays in the world.
The COSMOS consortium (Shellard) has had long-standing collaborations with SGI (since 1997) and with Intel (since 2003), which have allowed access to leading-edge shared-memory technologies, including the world's first UV2000 in 2012, which was also the first SMP system enabled with Intel Xeon Phi (Knights Corner) processors. Adaptive Computing are using the COSMOS@DiRAC platform to develop a single-image version of their MOAB HPC Suite.
Publications
Idini A
(2019)
Ab Initio Optical Potentials and Nucleon Scattering on Medium Mass Nuclei.
in Physical review letters
Power C
(2019)
nIFTy galaxy cluster simulations VI: the dynamical imprint of substructure on gaseous cluster outskirts.
in Monthly Notices of the Royal Astronomical Society
Rosito M
(2019)
Assembly of spheroid-dominated galaxies in the EAGLE simulation
in Astronomy & Astrophysics
Stevenson P
(2019)
Low-energy heavy-ion reactions and the Skyrme effective interaction
in Progress in Particle and Nuclear Physics
Barbieri C
(2019)
Lepton scattering from 40Ar and 48Ti in the quasielastic peak region
in Physical Review C
Nayakshin, S
(2019)
ALMA observations require slower Core Accretion runaway growth
Gurung-López S
(2019)
Lyα emitters in a cosmological volume II: the impact of the intergalactic medium
in Monthly Notices of the Royal Astronomical Society
Battino U
(2019)
NuGrid stellar data set - III. Updated low-mass AGB models and s-process nucleosynthesis with metallicities Z = 0.01, Z = 0.02, and Z = 0.03
in Monthly Notices of the Royal Astronomical Society
Davies C
(2019)
Improving the kinetic couplings in lattice nonrelativistic QCD
in Physical Review D
Hori K
(2019)
Anelastic torsional oscillations in Jupiter's metallic hydrogen region
in Earth and Planetary Science Letters
Grisdale K
(2019)
On the observed diversity of star formation efficiencies in Giant Molecular Clouds
in Monthly Notices of the Royal Astronomical Society
Baugh C
(2019)
Galaxy formation in the Planck Millennium: the atomic hydrogen content of dark matter haloes
in Monthly Notices of the Royal Astronomical Society
Fossati M
(2019)
The MUSE Ultra Deep Field (MUDF). II. Survey design and the gaseous properties of galaxy groups at 0.5 < z < 1.5
in Monthly Notices of the Royal Astronomical Society
Humphries J
(2019)
On the origin of wide-orbit ALMA planets: giant protoplanets disrupted by their cores
in Monthly Notices of the Royal Astronomical Society
Pittard J
(2019)
Momentum and energy injection by a supernova remnant into an inhomogeneous medium
in Monthly Notices of the Royal Astronomical Society
Trayford J
(2019)
Resolved galaxy scaling relations in the eagle simulation: star formation, metallicity, and stellar mass on kpc scales
in Monthly Notices of the Royal Astronomical Society
Gurung-López S
(2019)
Lyα emitters in a cosmological volume - I. The impact of radiative transfer
in Monthly Notices of the Royal Astronomical Society
Harvey D
(2019)
Observable tests of self-interacting dark matter in galaxy clusters: BCG wobbles in a constant density core
in Monthly Notices of the Royal Astronomical Society
Young R
(2019)
Simulating Jupiter's weather layer. Part II: Passive ammonia and water cycles
in Icarus
McLean E
(2019)
Lattice QCD form factor for Bs → Ds* lν at zero recoil with nonperturbative current renormalization
in Physical Review D
Cautun M
(2019)
The aftermath of the Great Collision between our Galaxy and the Large Magellanic Cloud
in Monthly Notices of the Royal Astronomical Society
Hatton D
(2019)
Renormalizing vector currents in lattice QCD using momentum-subtraction schemes
in Physical Review D
Jauzac M
(2019)
The core of the massive cluster merger MACS J0417.5-1154 as seen by VLT/MUSE
in Monthly Notices of the Royal Astronomical Society
Sohn W
(2019)
CMB-S4 forecast on the primordial non-Gaussianity parameter of feature models
in Physical Review D
Nightingale J
(2019)
Galaxy structure with strong gravitational lensing: decomposing the internal mass distribution of massive elliptical galaxies
in Monthly Notices of the Royal Astronomical Society
Davies C
(2019)
Determination of the quark condensate from heavy-light current-current correlators in full lattice QCD
in Physical Review D
Muia F
(2019)
The fate of dense scalar stars
in Journal of Cosmology and Astroparticle Physics
Liao S
(2019)
Ultra-diffuse galaxies in the Auriga simulations
in Monthly Notices of the Royal Astronomical Society
Shao S
(2019)
Screening maps of the local Universe I - Methodology
in Monthly Notices of the Royal Astronomical Society
Naik A
(2019)
Constraints on chameleon f(R)-gravity from galaxy rotation curves of the SPARC sample
in Monthly Notices of the Royal Astronomical Society
Shanahan R
(2019)
Strong Excess Faraday Rotation on the Inside of the Sagittarius Spiral Arm
in The Astrophysical Journal Letters
Martin G
(2019)
The formation and evolution of low-surface-brightness galaxies
in Monthly Notices of the Royal Astronomical Society
Callingham T
(2019)
The mass of the Milky Way from satellite dynamics
in Monthly Notices of the Royal Astronomical Society
Horsley R
(2019)
Isospin splittings in the decuplet baryon spectrum from dynamical QCD + QED
in Journal of Physics G: Nuclear and Particle Physics
Davoudi Z
(2019)
Theoretical aspects of quantum electrodynamics in a finite volume with periodic boundary conditions
in Physical Review D
Lovell M
(2019)
The signal of decaying dark matter with hydrodynamical simulations
in Monthly Notices of the Royal Astronomical Society
Guervilly C
(2019)
Turbulent convective length scale in planetary cores.
in Nature
Coles P
(2019)
ExoMol molecular line lists - XXXV. A rotation-vibration line list for hot ammonia
in Monthly Notices of the Royal Astronomical Society
Regan J
(2019)
Super-Eddington accretion and feedback from the first massive seed black holes
in Monthly Notices of the Royal Astronomical Society
Bijnens J
(2019)
Electromagnetic finite-size effects to the hadronic vacuum polarization
in Physical Review D
Wen K
(2019)
Dissipation Dynamics of Nuclear Fusion Reactions
in Acta Physica Polonica B
Davies C
(2019)
Meson electromagnetic form factors from lattice QCD
Pawlik M
(2019)
The diverse evolutionary pathways of post-starburst galaxies
in Nature Astronomy
Kimm T
(2019)
Understanding the escape of LyC and Lyα photons from turbulent clouds
in Monthly Notices of the Royal Astronomical Society
McNally C
(2019)
Multiplanet systems in inviscid discs can avoid forming resonant chains
in Monthly Notices of the Royal Astronomical Society: Letters
Rhodin N
(2019)
The nature of strong H I absorbers probed by cosmological simulations: satellite accretion and outflows
in Monthly Notices of the Royal Astronomical Society
Nixon C
(2019)
What is wrong with steady accretion discs?
in Astronomy & Astrophysics
Wareing C
(2019)
Sheets, filaments, and clumps - high-resolution simulations of how the thermal instability can form molecular clouds
in Monthly Notices of the Royal Astronomical Society
Description | Many new discoveries about galaxy formation and evolution, star formation, and planet formation have been made possible by the award. |
Exploitation Route | Many international collaborative projects are supported by the HPC resources provided by DiRAC. |
Sectors | Aerospace, Defence and Marine,Creative Economy,Digital/Communication/Information Technologies (including Software),Education,Manufacturing, including Industrial Biotechnology,Retail,Other |
URL | http://www.dirac.ac.uk |
Description | Significant co-design project with Hewlett-Packard Enterprise, including partnership in the HPE/Arm/Suse Catalyst UK programme. |
First Year Of Impact | 2017 |
Sector | Digital/Communication/Information Technologies (including Software) |
Impact Types | Societal |
Description | DiRAC 2.5x Project Office 2017-2020 |
Amount | £300,000 (GBP) |
Organisation | Science and Technology Facilities Council (STFC)
Sector | Public |
Country | United Kingdom |
Start | 02/2018 |
End | 03/2020 |
Title | Citation analysis and impact
Description | Use of IT to determine the academic impact of eInfrastructure
Type Of Material | Improvements to research infrastructure |
Year Produced | 2017 |
Provided To Others? | Yes |
Impact | Helped us understand emerging trends in DiRAC science, decide the scale and type of IT investments, and direct the development of new technologies
URL | http://www.dirac.ac.uk |
Description | Co-design project with Hewlett Packard Enterprise |
Organisation | Hewlett Packard Enterprise (HPE) |
Country | United Kingdom |
Sector | Private |
PI Contribution | Technical support and operations costs for running the hardware. Research workflows to test the system performance, and investment of academic time and software engineering time to optimise code for new hardware. Project will explore suitability of hardware for DiRAC workflows and provide feedback to HPE. |
Collaborator Contribution | In-kind provision of research computing hardware. Value is commercially confidential. |
Impact | As this collaboration is about to commence, there are no outcomes to report at this point. |
Start Year | 2018 |
Description | Nuclei from Lattice QCD |
Organisation | RIKEN |
Department | RIKEN-Nishina Center for Accelerator-Based Science |
Country | Japan |
Sector | Public |
PI Contribution | Surrey performed ab initio studies of LQCD-derived nuclear forces |
Collaborator Contribution | Work by Prof. Hatsuda and collaborators at the iTHEMS and Quantum Hadron Physics Laboratory to provide nuclear forces derived from LQCD |
Impact | Phys. Rev. C 97, 021303(R) |
Start Year | 2015 |
Description | STFC Centres for Doctoral Training in Data Intensive Science |
Organisation | University of Leicester |
Department | STFC DiRAC Complexity Cluster (HPC Facility Leicester) |
Country | United Kingdom |
Sector | Academic/University |
PI Contribution | Support for STFC Centres for Doctoral Training (CDT) in Data Intensive Science - DiRAC is a partner in five of the eight newly established STFC CDTs, and is actively engaged with them in developing industrial partnerships. DiRAC is also offering placements to CDT students interested in Research Software Engineering roles.
Collaborator Contribution | Students to work on interesting technical problems for DiRAC |
Impact | This is the first year of the partnership, so there are no outcomes to report yet
Start Year | 2017 |