Materials and Molecular Modelling Exascale Design and Development Working Group

Lead Research Organisation: University College London
Department Name: Chemistry


High performance computers (HPC), or supercomputers, offer exciting opportunities for understanding, developing and, increasingly, predicting the properties of complex materials through atomistic and electronic structure modelling; the scope and power of our computational techniques continue to expand as the capability of the hardware grows. The advent of exascale systems is the next dramatic step in this evolution. Such systems are costly both to purchase and to run, so it is imperative that appropriate software is developed before users gain access to exascale facilities. The investigators of this project are internationally leading experts in developing (enabling new science) and optimising (making simulations more efficient) state-of-the-art materials software for running simulations on HPC systems based both in the UK and abroad. Software that we have developed is used in both academia and industry.

Currently, our community consumes over a third of the UK's national HPC facility (ARCHER), which has approximately 120,000 cores and a peak performance of 2.5x10^15 Flop/s, and we anticipate no delay in getting the most out of its successor, composed of approximately 750,000 cores (estimated at ~28x10^15 Flop/s), when it becomes available to the UK academic community later this year. Exascale computers (by definition exceeding 10^18 Flop/s) will be composed of many more cores still. However, it is anticipated that the next generations of national computers will not provide a smooth transition from the existing infrastructures; instead, the UK national facilities will undergo a step change, shifting from conventional CPU-based architectures to CPUs hosting (multiple) many-core accelerators. Many, if not all, of our software packages will require major changes before these architectures can be fully exploited.
These changes will include reducing data exchange between cores, managing communications between CPUs and their accelerators, and adapting our procedures for handling input and output data.
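To illustrate why reducing data exchange between cores matters at this scale, consider the per-core communication volume of a halo exchange under two common domain decompositions. The sketch below is purely illustrative (the grid size, rank count and decompositions are hypothetical, not drawn from any of our packages), but it captures the surface-to-volume argument that drives such optimisations:

```python
# Illustrative comparison of per-rank halo (communication) volume for a
# 1D slab vs a 3D block decomposition of an N^3 grid over P ranks.
# Figures are hypothetical; real codes also overlap communication with
# computation and use package-specific decompositions.

def slab_halo(N, P):
    # 1D slabs: each rank holds an N x N x (N/P) slab and exchanges
    # two full N x N faces with its neighbours.
    return 2 * N * N

def block_halo(N, P):
    # 3D blocks: P = p^3 ranks, each holding an (N/p)^3 cube and
    # exchanging six (N/p)^2 faces with its neighbours.
    p = round(P ** (1.0 / 3.0))
    side = N // p
    return 6 * side * side

N, P = 1024, 4096           # grid points per side, ranks (hypothetical)
print(slab_halo(N, P))      # 2097152 grid points exchanged per rank
print(block_halo(N, P))     # 24576 grid points exchanged per rank
```

At these (hypothetical) sizes the 3D decomposition exchanges roughly 85 times less data per rank, which is the kind of restructuring that becomes unavoidable as core counts grow towards the exascale.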
As requested in the Call, we will form a design and development working group (DDWG) by bringing together Research Software Engineers (RSEs) and experts from mathematics and computer science with a wide range of domain experts in Materials and Molecular Modelling (MMM). We represent a very large and important community with long-established mechanisms for development, resource management and the dissemination of best practice, which the DDWG will exploit. To maximise the impact of the available UK funds for exascale computing, we will identify solutions that benefit most of our community.

Our DDWG aims to separate the fundamental mathematics of a problem from the computer science of its implementation. We will exploit current best practice, and practice under development in our domain and across other disciplines, in particular targeting libraries that can be called by many materials software packages and that offer a route to heterogeneous architectures. In a complementary development, we will tackle new workflows to manage and analyse vast volumes of simulation data. We will gain valuable experience from our regular meetings, from meetings with other DDWGs, and from knowledge transfer with our national and international project partners. Undertaking the initial work, and identifying what is required for the work earmarked for the next funding stage, will enrich our expertise and facilitate international collaborations with developers of materials software and users of overseas exascale computers.
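The separation of mathematics from implementation can be sketched as follows: the mathematical operation (here, a dense matrix product, the workhorse of many materials modelling kernels) is expressed once against an abstract interface, while architecture-specific backends are swapped in underneath. All names in this sketch (`Backend`, `ReferenceBackend`, `overlap_like_kernel`) are ours for illustration and do not refer to any existing library:

```python
# Sketch of separating the mathematics (what to compute) from the
# implementation (how a given architecture computes it). Only a plain
# Python reference backend is shown; tuned CPU and accelerator backends
# would implement the same interface. All names are illustrative.
from abc import ABC, abstractmethod

class Backend(ABC):
    @abstractmethod
    def matmul(self, a, b):
        """Dense matrix product of nested-list matrices a and b."""

class ReferenceBackend(Backend):
    def matmul(self, a, b):
        # Naive triple loop; a production backend would instead call a
        # tuned library (BLAS on CPUs, a vendor library on accelerators).
        n, k, m = len(a), len(b), len(b[0])
        return [[sum(a[i][x] * b[x][j] for x in range(k)) for j in range(m)]
                for i in range(n)]

def overlap_like_kernel(backend, c):
    # Stand-in for an electronic-structure-style kernel: form C * C^T.
    # The kernel never needs to know which architecture runs the product.
    ct = [list(row) for row in zip(*c)]
    return backend.matmul(c, ct)

c = [[1.0, 2.0], [3.0, 4.0]]
s = overlap_like_kernel(ReferenceBackend(), c)
print(s)  # [[5.0, 11.0], [11.0, 25.0]]
```

The point of the design is that the kernel above is written once; porting to a new accelerator then means writing one new `Backend` subclass rather than rewriting every materials code that calls it.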

This work enables the UK MMM community to use exascale HPC resources efficiently to address many EPSRC Grand Challenges, including Emergence and Nanoscale Design of Functional Materials (Physics); Dial a Molecule and Directed Assembly of Extended Structures with Targeted Properties (Chemical Sciences); and Engineering From Atoms to Applications (Engineering).

Planned Impact

The dramatic increase in research capabilities brought about by the advent of exascale computing in the fields of Materials and Molecular Modelling will propagate into practically every area of the economy and of the daily lives of people in the UK and across the world.

Materials performance underpins a large number of industrial processes, which are instrumental in maintaining global wealth and health, as well as playing a key role in developing processes that are both environmentally and economically sustainable. Our community has always led the early adoption of the latest HPC hardware, and our successful exploitation of exascale hardware will have an impact on: the industrial sector, including the chemicals, pharmaceuticals, biomaterials, energy and electronics industries; on society more generally; and on academic communities in chemistry, physics, materials, the life and earth sciences, and computational science. Bringing our community together with experts from outside our domain will ensure the continuing leadership of UK science in a strongly competitive field.

The specific areas of impact will be:

(i) Industry, where modelling and simulation are now integral tools in the design and optimisation of materials. Simulations made possible by the exploitation of exascale hardware will have direct relevance to industry, and our members have active collaborations with several UK-based companies, including Johnson Matthey, AstraZeneca, GlaxoSmithKline, Pfizer, Bristol-Myers Squibb, Process Systems Enterprise Limited, Britest Limited, Perceptive Engineering Ltd and BP. These industrial links will enable the project to contribute to the long-term competitiveness of the UK economy.

(ii) The General Public and policy makers, to whom the work of our community will be communicated via the MMM Hub, MCC, UKCP and CCP9 websites, and through a variety of outreach events promoting the key role of materials development and computational modelling in areas of general public interest, including energy technologies and policy.

(iii) Academic Groups - both experimental and computational - where the extensive network of our community will ensure the effective dissemination of results, with much of the work feeding into other projects. The software developed will be of wide benefit to both academic and industrial users.

(iv) RSE Staff - This is a high-profile, community effort in which RSE staff play a pivotal role. Engagement with this project will raise the profile of the RSE staff, promote greater engagement with the UK and international materials modelling communities, and offer many opportunities to gain experience with cutting-edge high-performance computing techniques in a research context, significantly enhancing the RSEs' career opportunities. Beyond the immediate RSE staff, this project further develops the UK RSE skill base to support exascale computing, ensuring UK researchers have the support they need to exploit exascale systems worldwide. The leading-edge skills and techniques employed in this project serve as exemplars to the HPC community, and will also lead to a "trickle down" effect, bringing significant improvements to smaller scale Tier 2 and Tier 3 HPC projects. In addition to domain-specific conferences, the work will be highlighted to the RSE community via the RSE Slack Channel and presented at the UK RSE Conference, as well as HPC conferences (e.g. Supercomputing and PASC).

