Publication
Solving a Higgs Optimization Problem with Quantum Annealing for Machine Learning
Alex Mott et al.
The discovery of Higgs-boson decays in a background of standard-model processes was assisted by machine learning methods. The classifiers used to separate signals such as these from the background are trained using highly accurate but imperfect simulations of the physical processes involved, often resulting in incorrect labeling of background processes or signals (label noise) and systematic errors. Here we use quantum and classical annealing (probabilistic techniques for approximating the global maximum or minimum of a given function) to solve a Higgs-signal-versus-background machine learning optimization problem, mapped to a problem of finding the ground state of a corresponding Ising spin model.
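To illustrate the annealing step the abstract describes, the sketch below runs classical simulated annealing to approximate the ground state of a small Ising model. It is a minimal illustration only: the random couplings `J`, fields `h`, problem size `n`, and cooling schedule are hypothetical stand-ins, not the Higgs-classification instance or the quantum annealer used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20                                           # number of spins (hypothetical)
J = np.triu(rng.normal(size=(n, n)), 1)          # random couplings J_ij, i < j
h = rng.normal(size=n)                           # random local fields h_i

def energy(s):
    """Ising energy E(s) = sum_{i<j} J_ij s_i s_j + sum_i h_i s_i."""
    return s @ J @ s + h @ s

s = rng.choice([-1, 1], size=n)                  # random initial spin configuration
E = energy(s)
best_s, best_E = s.copy(), E

for T in np.geomspace(2.0, 0.01, 5000):          # geometric cooling schedule
    i = rng.integers(n)                          # propose a single-spin flip
    s[i] *= -1
    E_new = energy(s)
    # Metropolis rule: always accept downhill moves,
    # accept uphill moves with probability exp(-dE / T)
    if E_new < E or rng.random() < np.exp(-(E_new - E) / T):
        E = E_new
        if E < best_E:
            best_s, best_E = s.copy(), E
    else:
        s[i] *= -1                               # reject: undo the flip

print("approximate ground-state energy:", best_E)
```

The cooling schedule controls the trade-off: high temperatures early on let the chain escape local minima, while low temperatures at the end settle it into a (hopefully global) minimum, which is the sense in which annealing is a probabilistic approximation technique.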