With the emergence of Quantum Machine Learning as a research field, interest in finding advantageous real-world applications is growing as well.
However, challenges concerning the limited number of qubits available on Noisy Intermediate-Scale Quantum (NISQ) devices and accuracy losses due to hardware imperfections remain, limiting the applicability of such approaches in real-world scenarios.
For simplification, most studies therefore assume nearly noise-free conditions, as expected with logical, i.e. error-corrected, qubits rather than the physical qubits provided by current hardware.
However, the number of logical qubits is expected to grow slowly, as each requires a large number of physical qubits for error correction.
This motivates us to treat noise as an unavoidable, non-negligible problem on NISQ devices.
Using the example of particle decay tree reconstruction, a highly complex combinatorial problem in High Energy Physics, we investigate methods to reduce the impact of noise on such devices and propose a hybrid architecture in which a classical graph neural network is extended by a parameterized quantum circuit.
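The hybrid idea of feeding classical network outputs into a small parameterized quantum circuit can be illustrated with a classically simulated two-qubit circuit. This is a minimal sketch only: the ansatz shown here (RY data encoding, one trainable RY layer, a single CNOT entangler) and all function names are illustrative assumptions, not the circuit actually used in the paper.

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation gate."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

# CNOT with qubit 0 as control, qubit 1 as target (basis order |q0 q1>)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def pqc_expectation(inputs, params):
    """Hypothetical PQC layer: encode two classical features as RY
    angles, apply a trainable RY layer and a CNOT entangler, and
    return the <Z> expectation value of each qubit."""
    state = np.zeros(4)
    state[0] = 1.0                                   # start in |00>
    state = np.kron(ry(inputs[0]), ry(inputs[1])) @ state   # data encoding
    state = np.kron(ry(params[0]), ry(params[1])) @ state   # trainable layer
    state = CNOT @ state                                    # entangler
    probs = state ** 2
    z0 = probs[0] + probs[1] - probs[2] - probs[3]   # <Z> on qubit 0
    z1 = probs[0] - probs[1] + probs[2] - probs[3]   # <Z> on qubit 1
    return np.array([z0, z1])
```

In a hybrid model of this kind, the two returned expectation values would replace (or augment) part of a classical layer's output, and `params` would be optimized jointly with the classical network's weights.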
While we have previously shown that such a hybrid architecture reduces the number of trainable parameters compared to the fully classical case, we are now specifically interested in its actual performance on real quantum devices.
Using simple synthetic decay trees, we train the network in classical simulations to allow for efficient optimization of the parameters.
The trained parameters are validated on NISQ devices provided by IBM Quantum and are used in interpretability and significance studies, enabling improvements in accuracy on real devices.
In summary, we improved the validation performance of the existing architecture on real quantum devices.