When the Large Hadron Collider (LHC) is operating, it produces more than one billion proton–proton interactions every second. But exactly how many take place in the LHC experiments? Critical to every analysis of LHC data is a high-precision measurement of what is known as luminosity, that is, the total number of proton–proton interactions in a given dataset. It allows physicists to evaluate the probability of interesting proton–proton collision events occurring, as well as to predict the rates of similar-looking background processes. Isolating such events from the background processes is crucial for both searches for new phenomena and precision measurements of known Standard Model processes.
The ATLAS collaboration has recently released its most precise luminosity measurement to date. Researchers studied data taken over the course of four years (2015–2018), covering the entire Run 2 of the LHC, to determine the total luminosity delivered to the ATLAS experiment during that period.
What exactly did this measurement entail? When proton beams circulate in the LHC, they are arranged in “bunches”, each containing more than 100 billion protons. As two bunches circulating in opposite directions cross, some of the protons interact. Determining how many interactions occur in each bunch crossing provides a measure of the luminosity. Its value depends on the number of protons in each bunch, how tightly the protons are squeezed together, and the angle at which the bunches cross. The luminosity also depends on the number of colliding proton bunches in each beam.
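In a simplified form, assuming Gaussian-shaped bunches and leaving aside detailed corrections, these ingredients combine into the familiar expression for the instantaneous luminosity:

```latex
\mathcal{L} \;=\; \frac{n_b\, f_{\mathrm{rev}}\, N_1 N_2}{4\pi\, \sigma_x \sigma_y}\, F ,
```

where n_b is the number of colliding bunch pairs, f_rev ≈ 11 245 Hz is the LHC revolution frequency, N_1 and N_2 are the numbers of protons in the two colliding bunches, σ_x and σ_y are the transverse beam sizes at the collision point, and F ≤ 1 is a geometric reduction factor accounting for the crossing angle.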
ATLAS has several detectors that are sensitive to the number of particles produced in proton–proton interactions, and the average number of measured particles is often proportional to the average number of proton–proton interactions per bunch crossing. Researchers can therefore use this average to monitor the “instantaneous” luminosity in real time during data-taking periods, and to measure the cumulative (“integrated”) luminosity over longer periods of time.
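In terms of the quantities that the luminosity detectors actually record, this relation is commonly written as a sketch of the form

```latex
\mathcal{L}_{\mathrm{inst}} \;=\; \frac{\mu_{\mathrm{vis}}\, n_b\, f_{\mathrm{rev}}}{\sigma_{\mathrm{vis}}},
\qquad
\mathcal{L}_{\mathrm{int}} \;=\; \int \mathcal{L}_{\mathrm{inst}}\, \mathrm{d}t ,
```

where μ_vis is the average number of interactions (or particle counts) per bunch crossing registered by a given detector, and σ_vis is a detector-specific “visible” cross-section. Once σ_vis has been calibrated, as described below, counting particles directly yields the luminosity.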
While ATLAS’s luminosity-sensitive detectors provided relative measurements of the luminosity during data taking, measuring the absolute luminosity required a special LHC beam configuration that allows the detector signals to be calibrated. Once a year, the two LHC proton beams are deliberately displaced and swept across each other in small steps, while the particle counts in the luminosity detectors are recorded at each beam separation. This method is called a van der Meer beam-separation scan, named after Nobel laureate Simon van der Meer, who developed the idea in the 1960s for application at CERN’s Intersecting Storage Rings. It allows researchers to estimate the size of the beams and measure how densely the protons are packed in the bunches. With that information in hand, they can calibrate the detector signals.
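The idea can be illustrated with a toy fit. The numbers, the Gaussian scan model and the variable names below are invented for illustration and are not ATLAS analysis code; they merely show how the width of the rate-versus-separation curve and the bunch populations combine to give the calibration constant σ_vis used above.

```python
# Toy van der Meer scan (illustrative only, not ATLAS code).
# Fit the interaction rate vs. beam separation to get the convolved beam width,
# then combine it with the bunch populations to calibrate sigma_vis.
import numpy as np
from scipy.optimize import curve_fit

def scan_curve(separation_mm, peak_mu, width_mm):
    """Simple Gaussian model of mu_vis as a function of beam separation."""
    return peak_mu * np.exp(-0.5 * (separation_mm / width_mm) ** 2)

rng = np.random.default_rng(1)
separations = np.linspace(-0.3, 0.3, 25)                    # beam separation in mm
mu_vis = scan_curve(separations, 0.5, 0.1) + rng.normal(0, 0.005, separations.size)

# Horizontal scan fit; a real calibration repeats this for the vertical plane.
(peak_mu, sigma_x), _ = curve_fit(scan_curve, separations, mu_vis, p0=[0.5, 0.1])
sigma_y = sigma_x                                           # toy: assume the same width in y

# sigma_vis = peak mu_vis * 2*pi*Sigma_x*Sigma_y / (N1*N2), converted from mm^2 to mb.
n1 = n2 = 1.0e11                                            # protons per bunch (typical order)
sigma_vis_mb = peak_mu * 2 * np.pi * sigma_x * sigma_y / (n1 * n2) * 1e25  # 1 mm^2 = 1e25 mb

print(f"Convolved beam width ≈ {sigma_x * 1000:.0f} µm, sigma_vis ≈ {sigma_vis_mb:.0f} mb")
```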
Working in close collaboration with ATLAS researchers, LHC experts carried out van der Meer scans under low-luminosity conditions, with an average of about 0.5 proton–proton interactions per bunch crossing and very long gaps between the bunches. For comparison, the LHC typically operates with 20–50 interactions per bunch crossing and with bunches packed closer together in a “train” structure. The researchers therefore had to extrapolate the results of the van der Meer scans to the normal data-taking regime using the measurements from the luminosity-sensitive detectors.
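Schematically, once σ_vis has been calibrated in the van der Meer regime, the same relation is applied to the particle counts recorded during normal physics running and integrated over time. The sketch below uses invented numbers (bunch count, σ_vis, per-block μ values) purely to show the bookkeeping; the real measurement also relies on the corrections and consistency checks described above.

```python
# Toy integration of luminosity during physics running (illustrative numbers only).
F_REV = 11245.0           # LHC revolution frequency in Hz
N_BUNCHES = 2400          # colliding bunch pairs, typical of a Run-2 fill
SIGMA_VIS_MB = 30.0       # visible cross-section calibrated in a van der Meer scan (toy value)
MB_INV_TO_FB_INV = 1e-12  # 1 mb^-1 = 1e-12 fb^-1

# (average mu_vis seen by the detector, duration in seconds) for a few blocks of data taking.
lumi_blocks = [(15.0, 60.0), (14.8, 60.0), (14.5, 60.0)]

integrated_mb_inv = 0.0
for mu_vis, duration_s in lumi_blocks:
    inst_lumi = mu_vis * N_BUNCHES * F_REV / SIGMA_VIS_MB   # instantaneous luminosity in mb^-1 s^-1
    integrated_mb_inv += inst_lumi * duration_s

print(f"Integrated luminosity ≈ {integrated_mb_inv * MB_INV_TO_FB_INV:.4f} fb^-1")
```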
Using this approach, and after careful evaluation of the systematic effects that can influence a luminosity measurement, ATLAS physicists determined the integrated luminosity of the full Run 2 dataset recorded by ATLAS and certified as good for physics analysis to be 140.1 ± 1.2 fb⁻¹. For comparison, 1 inverse femtobarn (fb⁻¹) corresponds to about 100 trillion proton–proton collisions. With its uncertainty of 0.83%, the result represents the most precise luminosity measurement at a hadron collider to date. It improves upon previous ATLAS measurements by a factor of two and is comparable with the 0.9% precision achieved at the ISR experiments.
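That conversion can be checked with a quick back-of-the-envelope calculation, taking the inelastic proton–proton cross-section at 13 TeV to be roughly 80 mb:

```latex
N \;=\; \sigma_{\mathrm{inel}} \times \mathcal{L}_{\mathrm{int}}
\;\approx\; 80~\mathrm{mb} \times 1~\mathrm{fb}^{-1}
\;=\; 8\times 10^{13}~\mathrm{fb} \times 1~\mathrm{fb}^{-1}
\;\approx\; 10^{14}\ \text{collisions},
```

i.e. about 100 trillion collisions per inverse femtobarn, and therefore of order 10^16 inelastic collisions in the full 140 fb⁻¹ Run 2 dataset.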