NNPDF
Fitting proton substructure to experimental data using neural networks
Unravelling Proton Structure with Hyperoptimised Machine Learning
At energy-frontier facilities such as the Large Hadron Collider, scientists study the laws of Nature in their quest for novel phenomena both within and beyond the Standard Model of particle physics. An in-depth understanding of the quark and gluon substructure of protons and heavy nuclei is crucial to address pressing questions, from the nature of the Higgs boson to the origin of cosmic neutrinos. In this project we will tackle long-standing puzzles in our understanding of the strong interactions, from the origin of the proton spin to the strange content of nucleons. The key to achieving this will be the first-ever universal analysis of nucleon structure: a simultaneous determination of the momentum and spin distributions of quarks and gluons, together with their fragmentation into hadrons. We will combine an extensive experimental dataset with cutting-edge theory calculations within a machine learning framework in which neural networks parametrise the underlying physical laws while minimising ad-hoc model assumptions. Given that computing resources represent the major bottleneck in this ambitious research programme, it will be crucial to optimise the model training by exploiting GPUs. Furthermore, the exploration of the resulting complex parameter space demands an algorithmic strategy to determine the model hyperparameters, such as the network architecture.
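As an illustrative sketch only (not the project's actual code), the toy Python example below shows the general idea: a small neural network parametrises a structure-function-like quantity xf(x), is fitted to pseudo-data by minimising a loss, and its architecture and learning rate are chosen by an automated hyperparameter scan. The pseudo-data, the specific network shape, the use of a plain mean-squared-error loss in place of a full chi-squared with covariance, and the choice of TensorFlow/Keras and hyperopt are all assumptions made for the sketch.

```python
# Minimal sketch: neural-network parametrisation of a toy x*f(x) fitted to
# pseudo-data, with an algorithmic hyperparameter scan over architecture and
# learning rate. All data and settings here are invented for illustration.
import numpy as np
import tensorflow as tf
from hyperopt import fmin, hp, tpe

rng = np.random.default_rng(seed=0)

# Toy pseudo-data: a PDF-like shape x^a * (1 - x)^b with Gaussian noise.
x_grid = np.linspace(0.01, 0.99, 200).reshape(-1, 1)
truth = x_grid**0.5 * (1.0 - x_grid) ** 3
data = truth + rng.normal(scale=0.02, size=truth.shape)


def build_model(units: int, learning_rate: float) -> tf.keras.Model:
    """Small feed-forward network mapping x -> x*f(x)."""
    model = tf.keras.Sequential(
        [
            tf.keras.layers.Dense(units, activation="tanh"),
            tf.keras.layers.Dense(units, activation="tanh"),
            tf.keras.layers.Dense(1, activation="linear"),
        ]
    )
    model.compile(
        optimizer=tf.keras.optimizers.Adam(learning_rate=learning_rate),
        loss="mse",  # stands in for a chi^2 with a diagonal covariance
    )
    return model


def objective(params: dict) -> float:
    """Figure of merit returned to the hyperparameter optimiser."""
    model = build_model(int(params["units"]), params["learning_rate"])
    model.fit(x_grid, data, epochs=200, verbose=0)
    return float(model.evaluate(x_grid, data, verbose=0))


# Algorithmic hyperparameter search (Tree-structured Parzen Estimator).
space = {
    "units": hp.choice("units", [10, 25, 50]),
    "learning_rate": hp.loguniform("learning_rate", np.log(1e-4), np.log(1e-1)),
}
best = fmin(fn=objective, space=space, algo=tpe.suggest, max_evals=20)
print("best hyperparameters:", best)
```

In a realistic fit the loss would be a chi-squared built from the experimental covariance matrix, the network would be convolved with theory predictions rather than compared to data pointwise, and the training would be batched on GPUs; the sketch only conveys the structure of the neural-network parametrisation plus hyperparameter scan.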