Parton Distributions with Electroweak corrections

Lead Research Organisation: University of Oxford
Department Name: Oxford Physics

Abstract

The Large Hadron Collider (LHC) is the most powerful particle accelerator ever built: in its 27 km long tunnel below Geneva, Switzerland, protons are accelerated to almost the speed of light and then made to collide. These high-energy collisions recreate conditions similar, in energy and temperature, to those shortly after the Big Bang. Under these extreme conditions, physicists study the laws of nature at the smallest distances ever probed and try to answer some of the ultimate questions that humankind has always wondered about: the origin of the universe, the nature of forces and the quest for the ultimate constituents of matter. Since its start-up in 2009, the LHC has been operating with superb performance. The discovery of the Higgs boson has been its biggest achievement to date, recognized by the 2013 Nobel Prize in Physics. After a two-year shutdown, the LHC will restart in 2015 at nearly double its energy. There are good reasons to expect that over its lifespan the LHC will continue to revolutionize our understanding of the nature of fundamental forces and matter. For instance, the Higgs mechanism, through which elementary particles acquire their mass, will be scrutinized with increased precision, and the exploration of the energy frontier will enter an entirely new region, searching for new particles and interactions beyond the Standard Model of particle physics, including possible Dark Matter particles.

It is therefore of utmost importance to provide accurate theoretical predictions for the relevant processes at the LHC, including Higgs production, as well as for a variety of new physics scenarios. Crucial ingredients of these predictions are the Parton Distribution Functions (PDFs) of the proton, which encode the dynamics determining how the proton's energy is split among its constituents, quarks and gluons, in each LHC collision. Modern PDF sets are extracted from data using calculations in Quantum Chromodynamics (QCD), the theory that governs the interactions of quarks and gluons. However, the excellent precision achieved in QCD calculations implies that now, at the dawn of Run II of the LHC, it becomes mandatory to also account for another important class of corrections: electroweak effects, which involve both photons and the W and Z weak bosons. Accounting for these effects is of vital importance to fully exploit the potential of the upcoming Run II LHC data for our understanding of the structure of the proton.

In this project, we aim to develop a fully general and automated fast interface for electroweak calculations in PDF fits. A major difficulty in including electroweak effects in PDF fits is the fact that the relevant calculations are either unknown, not publicly available, or not available in a format that can be used in PDF fits. I propose to solve these problems in a fully general manner using MadGraph5_aMC@NLO, a program that achieves the complete automation of the computation of next-to-leading-order (NLO) cross sections. Recently, I have developed aMCfast, a fast interface to MadGraph5_aMC@NLO that makes it straightforward to include any NLO process, no matter how complicated, in a PDF analysis. Another goal of this project is to provide a precision determination of the photon PDF from LHC data. In the presence of Quantum Electrodynamics (QED), a striking prediction of the theory is that the proton is composed not only of quarks and gluons but also of photons, and Feynman diagrams with incoming photon legs lead to sizable effects, comparable to pure weak effects, in many LHC processes. Finally, I plan to generalize the PDF evolution equations, which determine how the inner content of the proton is modified when we resolve it at smaller and smaller distances, to include electroweak effects, and to study for the first time their implications for LHC phenomenology.
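Schematically, and only as a sketch of the standard leading-order structure rather than a result of this project, the QED-corrected evolution couples a quark PDF q_i and the photon PDF gamma through both QCD and QED splitting functions:

```latex
\frac{\partial q_i(x,\mu^2)}{\partial\ln\mu^2}
  = \frac{\alpha_s}{2\pi}\Big(P_{qq}\otimes q_i + P_{qg}\otimes g\Big)
  + \frac{\alpha}{2\pi}\,e_i^2\Big(\tilde{P}_{qq}\otimes q_i + P_{q\gamma}\otimes\gamma\Big),
\qquad
\frac{\partial \gamma(x,\mu^2)}{\partial\ln\mu^2}
  = \frac{\alpha}{2\pi}\Big(\sum_i e_i^2\,P_{\gamma q}\otimes q_i + P_{\gamma\gamma}\otimes\gamma\Big),
```

where the P's are the splitting functions, \otimes denotes the usual Mellin convolution, and e_i is the electric charge of quark flavour i; the full electroweak generalization proposed here goes beyond this pure-QED sketch.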

Planned Impact

To begin with, one of the main challenges of modern scientific research is making its results accessible to everyone. Indeed, the open-science paradigm is an important ingredient in maximizing the impact of science on society. As a result of the research proposed in this Grant, the open-source code aMCfast will allow any interested party to use the results of high-precision calculations in their own analyses. While these results are mostly of interest to the academic community, the same methods could be applied in a variety of contexts relevant to society as a whole, and in particular to improve the research capacity and skills of technology companies. The central idea of the aMCfast framework is, given a complicated and CPU-intensive computer program, to precompute and interpolate all the expensive tasks and then provide the public with an extremely fast and efficient interface that delivers the same results as the original program in a greatly reduced amount of time. During this Grant, I will explore, with the support of ISIS Innovation, which manages technology transfer in Oxford, opportunities to use these methods in non-academic applications, for example exploiting synergies with the technological start-ups in the Oxford area.
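The precompute-and-interpolate idea can be illustrated with a toy sketch (this is an illustration of the general technique only, not the actual aMCfast implementation, whose grids and interpolation orders are more sophisticated):

```python
import math
from bisect import bisect_left

def expensive_observable(x):
    # Stand-in for a CPU-intensive calculation (e.g. an NLO cross section).
    return math.exp(-x) * math.sin(10 * x) ** 2

# Step 1 (slow, done once): tabulate the observable on a fixed grid.
GRID_X = [i / 200 for i in range(201)]          # 201 nodes on [0, 1]
GRID_F = [expensive_observable(x) for x in GRID_X]

def fast_observable(x):
    """Step 2 (fast, repeated many times): linear interpolation on the grid."""
    if not 0.0 <= x <= 1.0:
        raise ValueError("x outside tabulated range")
    i = bisect_left(GRID_X, x)
    if GRID_X[i] == x:                           # exactly on a grid node
        return GRID_F[i]
    x0, x1 = GRID_X[i - 1], GRID_X[i]
    t = (x - x0) / (x1 - x0)
    return (1 - t) * GRID_F[i - 1] + t * GRID_F[i]
```

After the one-time tabulation, each call to `fast_observable` costs only a binary search and one linear blend, independent of how expensive the original calculation was; this is what allows a full NLO computation to be reused inside the thousands of iterations of a PDF fit.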

Another possibility for society-wide impact of the research proposed in this Grant is the following. In order to estimate the uncertainties in the global analysis of the proton structure, I have been one of the main developers of a rather sophisticated mathematical framework, known as the NNPDF framework, which combines high-level statistical techniques (Monte Carlo methods, machine learning and Bayesian inference) with a robust implementation in user-friendly, object-oriented software tools. With all this know-how in uncertainty estimation, it is possible and desirable to extend these tools to other applications that would benefit society. One possibility with potentially substantial impact, which will be studied in more detail if the Grant is funded, would be to develop an open-access software tool able to perform robust, quantitative cost/benefit analyses and risk assessments of basic research projects. Such a tool would provide both researchers and decision-makers with key input, either to prepare research proposals or to evaluate, based on quantitative indicators, which ones should be funded. This approach would inherit the techniques used in the NNPDF analysis, namely artificial intelligence techniques such as neural networks, and the robust propagation of model uncertainties from the input data to the final forecast, including the effects of correlations.
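The core of the Monte Carlo uncertainty propagation can be sketched in a few lines (a toy illustration only: the trivial slope fit below stands in for the full neural-network PDF fit, and correlations between data points are ignored for simplicity):

```python
import random
import statistics

random.seed(7)

# Toy "experimental data": points (x, y, sigma) with Gaussian uncertainties.
data = [(1.0, 2.1, 0.2), (2.0, 3.9, 0.3), (3.0, 6.2, 0.4)]

def fit_slope(points):
    """Least-squares slope through the origin; stand-in for the full fit."""
    num = sum(x * y for x, y, _ in points)
    den = sum(x * x for x, _, _ in points)
    return num / den

# Generate Monte Carlo replicas of the data, fluctuating each point within
# its uncertainty, and fit each replica independently.
replicas = []
for _ in range(1000):
    rep = [(x, random.gauss(y, s), s) for x, y, s in data]
    replicas.append(fit_slope(rep))

# The ensemble of fits propagates the data uncertainty to the result:
central = statistics.mean(replicas)       # central value
uncertainty = statistics.stdev(replicas)  # 1-sigma uncertainty
```

The same logic, with neural networks replacing `fit_slope`, is what lets the NNPDF approach propagate experimental uncertainties to any derived quantity without assuming a functional form for the PDFs.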

Finally, one of the main outcomes of this Grant will be the training of a Ph.D. student in state-of-the-art methods for data analysis and numerical simulation in high-energy physics. Thanks to this high-quality training, the Ph.D. student will acquire a set of skills that will be of great value should he or she choose a non-academic path. For instance, the data analysis methods used within the NNPDF software are based on mathematical tools, such as artificial neural networks and genetic algorithms, that are used in many branches of industry, from data-mining software to banking and investment. Also, thanks to a strong grounding in high-level programming languages like C++ and Python, the student should be able to easily pursue a career in the computing industry. Moving from the academic world to industry is especially easy in the Oxford environment, thanks to the many scientific and technological start-ups that have developed there in recent years. Similar considerations apply to the PDRA.

Publications

Gauld R (2016) The prompt atmospheric neutrino flux in the light of LHCb in Journal of High Energy Physics

Czakon M (2016) Summary of the Topical Workshop on Top Quark Differential Distributions 2014 in Journal of Physics G: Nuclear and Particle Physics

Butterworth J (2016) PDF4LHC recommendations for LHC Run II in Journal of Physics G: Nuclear and Particle Physics

Rojo J (2016) Parton Distributions based on a Maximally Consistent Dataset in Nuclear and Particle Physics Proceedings

Ball R (2016) Intrinsic charm in a matched general-mass scheme in Physics Letters B

Lin H (2018) Parton distributions and lattice QCD calculations: A community white paper in Progress in Particle and Nuclear Physics

Ball RD (2017) Parton distributions from high-precision collider data in The European Physical Journal C

Ball RD (2016) A determination of the charm content of the proton in The European Physical Journal C

Beenakker W (2016) NLO+NLL squark and gluino production cross sections with threshold-improved parton distributions in The European Physical Journal C

Ball RD (2016) The asymptotic behaviour of parton distributions at small and large x in The European Physical Journal C

Carrazza S (2016) Specialized minimal PDFs for optimized LHC calculations in The European Physical Journal C

 
Description Explained in previous report
Exploitation Route Outlined in previous report
Sectors Education

 
Description AMVA4NewPhysics Marie Curie Initial Training Network of the European Commission 
Organisation European Commission
Country European Union (EU) 
Sector Public 
PI Contribution Exploitation of advanced multivariate techniques for the search for New Physics at the Large Hadron Collider
Collaborator Contribution General organisation of the network and research activities.
Impact .
Start Year 2015