An International Team Led by the Institute for Corpuscular Physics Develops an Algorithm for More Accurate Representations of Collisions in Accelerators Like the LHC

An international team led by researchers from the Institute for Corpuscular Physics (IFIC), a joint center of the Spanish National Research Council (CSIC) and the University of Valencia, has developed an algorithm that enhances the accuracy of predictions regarding the behavior of elementary particles in accelerators like CERN’s Large Hadron Collider (LHC). This new method is based on quantum vacuum fluctuations—an intriguing phenomenon in physics that paradoxically enables more precise mathematical representations of physical processes. The method, published in the prestigious journal Physical Review Letters, has been implemented for the first time on a quantum computer, a breakthrough detailed in another article published in Quantum Science and Technology.

In quantum physics, the vacuum is a concept as fascinating as it is perplexing. Far from being an empty space devoid of content, it is a dynamic arena where particles and antiparticles constantly emerge and annihilate, governed by Heisenberg’s uncertainty principle. These quantum vacuum fluctuations are fleeting, but accounting for them leaves an indelible imprint on the theory and significantly improves predictions about the behavior of subatomic particles, an essential ingredient for interpreting data from experiments such as those conducted at the LHC.

Traditionally, theoretical models predicting this behavior have relied on diagrams introduced by Nobel laureate Richard Feynman, which graphically and concisely depict interactions between a set of initially colliding particles and those that emerge as a result. However, the mathematical formalism employed in these models can, in certain cases, allow some of these particles to be produced with exactly zero energy (so-called soft configurations) or travelling in exactly the same direction (collinear configurations).

Although these configurations are mathematically valid, they lack physical meaning. This phenomenon highlights a fundamental feature of quantum mechanics: the number of particles is not fixed and can change due to quantum fluctuations. This complexity introduces significant theoretical challenges, often leading to mathematical infinities that hinder precise results.

The IFIC-led research proposes an innovative approach: basing theoretical calculations on vacuum amplitudes—that is, diagrams that exclude external particles and focus on the intrinsic fluctuations of the quantum vacuum. This strategy eliminates the difficulties associated with infinite values and provides more accurate mathematical representations of real physical processes.
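To illustrate the idea in standard textbook notation (this schematic integral is a generic example, not one taken from the paper itself), the simplest one-loop vacuum amplitude involves a single internal particle of mass $m$:

```latex
A^{(1)}_{\text{vac}} \;=\; \int \frac{d^d\ell}{(2\pi)^d}\,
\frac{1}{\ell^2 - m^2 + i0}\,.
```

Such an expression depends only on the internal mass and the loop momentum being integrated over. With no external particles attached, the problematic zero-energy and collinear configurations tied to external legs simply never arise.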

As Germán Rodrigo, principal investigator of the LHCPHENO group at IFIC and leader of the study, explains: “When a mathematical formalism leads to unnecessary complications, it is often a sign that a more elegant and direct approach exists to obtain the result. The method we have developed explicitly incorporates the fundamental physical principle of causality, or cause and effect. In addition to enabling more advanced theoretical predictions, it offers a new perspective on understanding the enigmatic quantum properties of the vacuum,” says the CSIC physicist.

Applications in Quantum Computing

The absence of infinities, combined with the intrinsic quantum nature of particle physics, has allowed the researchers to implement their new algorithm successfully on a quantum computer. This milestone has enabled the first-ever prediction on such platforms of the Higgs boson decay rate at second order in perturbation theory, within quantum field theory, the theoretical framework that merges quantum mechanics and special relativity to describe how elementary particles interact.

This represents a significant breakthrough, as high-order calculations in quantum field theory—where each new order significantly refines system descriptions—are extremely complex and require substantial computational power. Achieving this result on a quantum computer not only validates its capability to tackle advanced theoretical physics problems but also opens new possibilities for applying quantum computing to elementary particle simulations and other high-energy physics applications.
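As a purely illustrative aside (this is not the algorithm of the paper): decay-rate predictions ultimately reduce to multi-dimensional integrals over the momenta of the final-state particles, and it is this integration step that quantum techniques such as amplitude estimation can accelerate, improving the error scaling of classical Monte Carlo sampling from 1/√N to roughly 1/N in the number of samples or queries. The toy sketch below shows only the classical Monte Carlo baseline on a simple stand-in integrand; the function names and the integrand are invented for illustration.

```python
import math
import random

def mc_integrate(f, a, b, n=100_000, seed=0):
    """Plain Monte Carlo estimate of the integral of f over [a, b].

    The statistical error shrinks as 1/sqrt(n). Quantum amplitude
    estimation improves this scaling to roughly 1/n, which is the kind
    of advantage targeted when such integrals are run on quantum
    hardware.
    """
    rng = random.Random(seed)
    total = sum(f(a + (b - a) * rng.random()) for _ in range(n))
    return (b - a) * total / n

# Toy "decay-rate-like" integrand: a positive weight over a finite
# interval (purely illustrative, not the integrand of the paper).
estimate = mc_integrate(lambda x: math.sin(x) ** 2, 0.0, math.pi)
print(estimate)  # close to the exact value pi/2
```

Halving the Monte Carlo error here requires four times as many samples; a quantum integrator would need only about twice as many queries, which is why higher-order calculations are an attractive target for these devices.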

Jorge Martínez de Lejarza, a PhD student at IFIC and one of the authors of the latest study, states: “Quantum computers promise to revolutionize computing in the 21st century, surpassing classical computers in solving certain specific problems. In particle physics, we face some of the greatest scientific challenges, and our mission is to reformulate them to enable execution on quantum computers, thereby contributing to a deeper understanding of the universe.”

This advancement paves the way for new applications in quantum computing and marks a significant step forward in exploring the frontiers of particle physics. The research was conducted in collaboration with researchers from the University of Salamanca, the Autonomous University of Sinaloa (Mexico), and CERN’s Quantum Technologies Initiative.

References:

S. Ramírez-Uribe, P.K. Dhani, G.F.R. Sborlini, and G. Rodrigo, Rewording Theoretical Predictions at Colliders with Vacuum Amplitudes, Phys. Rev. Lett. 133 (2024) 211901. DOI: 10.1103/PhysRevLett.133.211901

J.J. Martínez de Lejarza, D.F. Rentería-Estrada, M. Grossi, and G. Rodrigo, Quantum Integration of Decay Rates at Second Order in Perturbation Theory, Quantum Sci. Technol. 10 (2025) 2, 025026. DOI: 10.1088/2058-9565/ada9c5