A new open release streamlines interactions with theoretical physicists
What if you could test a new theory against LHC data? Better yet, what if the specialist knowledge needed to do this was captured in a convenient format? This tall order is now being met by the ATLAS collaboration, with the first open release of the full analysis likelihoods of an LHC experiment.
“In particle physics, experimenters develop a very rich summary of measurements, which takes into account all relevant scattering processes and every source of uncertainty, encapsulated in what we call likelihoods,” explains Lukas Heinrich, research fellow at CERN working on the ATLAS experiment. “Likelihoods allow you to calculate how well the data observed in a particular experiment fit a specific model or theory.” Indeed, a likelihood summarizes all aspects of a particular analysis, from detector parameters, event selection, expected signals and background processes, to uncertainties and theoretical models. Extraordinarily complex and essential to any analysis, likelihoods are among the most valuable tools produced by the LHC experiments. Their public release will now allow phenomenologists around the world to explore ATLAS data in a whole new way.
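The idea can be illustrated with a deliberately minimal sketch, far simpler than the full ATLAS machinery: a single-bin counting experiment, where the likelihood is just the Poisson probability of the observed event count given a model's prediction. All numbers below are invented for illustration.

```python
import math

def poisson_likelihood(observed, expected):
    """Poisson probability of seeing `observed` events given expectation `expected`."""
    return math.exp(-expected) * expected**observed / math.factorial(observed)

# Toy single-bin analysis: background-only vs. signal-plus-background.
observed = 12        # events seen in data (illustrative number)
background = 10.0    # expected background events
signal = 4.0         # extra events predicted by a hypothetical new theory

l_bkg = poisson_likelihood(observed, background)           # background-only hypothesis
l_sig = poisson_likelihood(observed, background + signal)  # signal-plus-background hypothesis
print(f"L(background) = {l_bkg:.4f},  L(signal+background) = {l_sig:.4f}")
```

A real LHC likelihood plays the same role but spans many bins and channels, with hundreds of nuisance parameters encoding the uncertainties.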
ATLAS open likelihoods are available on HEPData, an open-access repository for experimental particle-physics data. The first open likelihoods come from a search for supersymmetry in proton–proton collision events containing Higgs bosons, numerous jets of b-quarks and missing transverse momentum. “While ATLAS had published likelihoods for analyses focusing on the Higgs boson in 2013, these did not capture the full complexity of the measurements,” says Kyle Cranmer, a professor at New York University. “We hope that this first release, which provides the full likelihoods in all their glory, will form a new bridge of communication between theorists and experimenters, enriching the discourse between the communities.”
The search for new physics will greatly benefit from open likelihoods. “If you’re a theorist developing a new idea, your first question is probably: ‘Is my model already excluded by the LHC experiments?’” says Giordon Stark, postdoctoral fellow at SCIPP, UC Santa Cruz. “Until now, there was no easy way to answer that.” Most LHC searches focus on certain benchmark models, giving a detailed analysis of the data to determine whether or not the Standard Model holds in the processes under consideration and, if so, which model parameters are still allowed and which are excluded by the data. But, of course, any search analysis is sensitive to many new-physics scenarios.
By using the publicly available likelihoods, theorists will now be able to modify the original hypothesis studied by ATLAS. Although such results may not reach the precision of the original result (given the missing step of simulating the putative processes in the ATLAS detector), they will give theorists a quick assessment of the potential of their new theory.
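As a toy illustration of such a reinterpretation, again with invented numbers rather than real ATLAS inputs, a theorist could scan the signal yields predicted by different parameter choices of a new model and compare each signal-plus-background hypothesis against the background-only likelihood:

```python
import math

def poisson(n, lam):
    """Poisson probability of n events given expectation lam."""
    return math.exp(-lam) * lam**n / math.factorial(n)

observed, background = 12, 10.0   # illustrative data and background expectation
l_bkg = poisson(observed, background)

# Scan signal yields of a hypothetical model; the 0.05 cut below is an
# arbitrary illustrative threshold, not an official exclusion criterion.
ratios = {}
for signal in [2.0, 5.0, 10.0, 20.0]:
    ratios[signal] = poisson(observed, background + signal) / l_bkg
    status = "disfavoured" if ratios[signal] < 0.05 else "allowed"
    print(f"signal = {signal:5.1f}  likelihood ratio = {ratios[signal]:.4f}  ({status})")
```

The published likelihoods let theorists perform the realistic version of this scan, with the full set of channels and uncertainties, without simulating the ATLAS detector themselves.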
Providing the tools for open analysis
But why are likelihoods needed to understand ATLAS data? Like many public scientific datasets, data from LHC experiments can be impenetrable without domain-specific knowledge. Before the data can begin to make sense, there is a large set of detector and software parameters to consider, as well as complex theoretical modelling.
“That is why the ATLAS collaboration has focused on open-data resources,” says Matthew Feickert, postdoctoral research associate at the University of Illinois at Urbana-Champaign. “It is our responsibility to minimize the complexity that stands between theorists and the relevant ATLAS information. There are many valuable questions that theorists outside the ATLAS experiment can help us answer, and we need to give them the best tools to do so.”
Since the early days of the LHC, there has been a strong consensus between the experimental and theoretical physics communities that this could best be done by publicly releasing the analysis likelihoods. However, the formats developed internally by experiments to share likelihoods were not well suited for publication or for easy use by the theoretical community. “Recently, we rewrote our likelihood software to take advantage of machine-learning frameworks, and realized that it also offered the possibility of solving the publication problem,” says Heinrich. “We’ve also made sure to choose formats that are both human-readable and machine-readable. This way, even as technologies and software evolve, the likelihoods will still be usable.”
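For a rough sense of what such a format buys you, here is a deliberately simplified sketch: a JSON document loosely modelled on the published likelihood format (the field names below are illustrative, not the exact schema), round-tripped with Python's standard json module to show the human-readable/machine-readable duality:

```python
import json

# Illustrative two-bin likelihood specification. The structure is a sketch
# inspired by the published format; do not rely on these exact field names.
spec = {
    "channels": [
        {
            "name": "signal_region",
            "samples": [
                {"name": "signal", "data": [5.0, 3.0],
                 "modifiers": [{"name": "mu", "type": "normfactor"}]},
                {"name": "background", "data": [50.0, 30.0],
                 "modifiers": [{"name": "bkg_uncertainty", "type": "normsys"}]},
            ],
        }
    ],
    "observations": [{"name": "signal_region", "data": [53, 31]}],
}

# Human-readable on disk, machine-readable in code: a plain JSON round trip.
text = json.dumps(spec, indent=2)
loaded = json.loads(text)
assert loaded == spec
print(text.splitlines()[0])
```

Because the serialization is plain JSON, the same file can be read by a person in a text editor, version-controlled, and parsed by any language's standard tooling, decoupling the published likelihood from any one software stack.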
Securing the future of open physics
“We plan to make the open publication of likelihoods a regular part of our publication process, and we have already released the likelihoods of a search for the direct production of pairs of tau sleptons,” explains Laura Jeanty, coordinator of the ATLAS Supersymmetry working group. “Over the next few months, we aim to gather feedback from theorists outside the collaboration, to better understand how they are using this new resource and to refine future releases.”
The public release of full likelihoods will also bring significant benefits to experimenters. “Likelihoods are a key ingredient for combining analyses from different experiments,” says Stark, who organizes the statistical combinations of ATLAS supersymmetry results. “As their open release becomes more mainstream, I look forward to seeing larger-scale statistical combinations.”
In parallel, the ATLAS Supersymmetry and Exotics working groups have also established a new approach to the preservation of analyses. “This is part of our effort to preserve ATLAS results,” says Federico Meloni, coordinator of the ATLAS Supersymmetry working group. “As theorists develop new ideas, the ATLAS data may need to be revisited. We therefore now archive the software and analysis tools used in a result before its publication.” This will facilitate future reinterpretations of the data, even years later. When these archived analysis pipelines are paired with the published likelihoods, physicists will be equipped with a transformative capability: the ability to test new theories against data in an automated way.
Together, these developments mark a new approach to open and reproducible research at the LHC. The ATLAS collaboration will continue to focus on creating rich, preservable, open-access tools, such as the open likelihoods, and looks forward to the compelling new insights they will create.