Quantum computing offers potential exponential speedups over classical computing for a range of tasks, including chemistry simulation, linear algebra, and large-integer factorization. Machine learning is a popular application that stands to benefit from this advantage. However, due to the inherent noise in current quantum hardware, quantum machine learning algorithms suffer losses in fidelity and accuracy. Existing research has addressed these issues arising from the unreliable execution of machine learning models on noisy quantum computers. In this paper, we explore the effects of noise on quantum machine learning and demonstrate approaches to mitigating them.
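To make the fidelity claim concrete, the following is a minimal sketch (not from the paper) of how a standard noise model degrades a quantum state: it applies a single-qubit depolarizing channel, a commonly assumed illustrative noise model, and reports the fidelity of the noisy state against the ideal one.

```python
import numpy as np

# Ideal pure single-qubit state |+> = (|0> + |1>) / sqrt(2)
psi = np.array([1.0, 1.0]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())  # density matrix of the ideal state

def depolarize(rho, p):
    """Single-qubit depolarizing channel: with probability p,
    replace the state with the maximally mixed state I/2."""
    return (1 - p) * rho + p * np.eye(2) / 2

# Fidelity <psi| rho' |psi> drops as the error probability p grows
for p in (0.0, 0.05, 0.2, 0.5):
    noisy = depolarize(rho, p)
    fidelity = np.real(psi.conj() @ noisy @ psi)
    print(f"p = {p:.2f}  fidelity = {fidelity:.3f}")
```

For this channel the fidelity works out to 1 - p/2, so even modest per-gate error rates compound quickly across the many gates of a quantum machine learning circuit, which is the degradation the paper sets out to mitigate.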