Implementation and empirical evaluation of a quantum machine learning pipeline for local classification.
In the current era, quantum resources are extremely limited, which makes it difficult to use quantum machine learning (QML) models. For supervised tasks, a viable approach is to introduce a quantum locality technique, which allows a model to focus only on the neighborhood of the element under consideration. A well-known locality technique is the k-nearest neighbors (k-NN) algorithm, of which several quantum variants have been proposed; however, none has yet been employed as a preliminary step of another QML model, whereas for the classical counterpart such a combination has already been shown to improve performance over the base models. In this paper, we propose and evaluate the idea of exploiting a quantum locality technique to reduce the size and improve the performance of QML models. Specifically, we provide (i) a Python implementation of a QML pipeline for local classification and (ii) its extensive empirical evaluation. The quantum pipeline, developed with Qiskit, consists of a quantum k-NN and a quantum binary classifier, both already available in the literature. The results show that the quantum pipeline is equivalent (in terms of accuracy) to its classical counterpart in the ideal case and confirm the validity of applying locality in the QML realm, but they also reveal the strong sensitivity of the chosen quantum k-NN to probability fluctuations and the better performance of classical baseline methods such as random forests.
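The two-stage locality pipeline described above can be sketched classically as follows. This is an illustrative stand-in only, not the authors' Qiskit implementation: it pairs a Euclidean-distance k-NN locality filter with a simple nearest-centroid binary classifier, and the function name `local_classify` and all parameters are hypothetical.

```python
import math


def local_classify(train, labels, x, k=3):
    """Locality-pipeline sketch (classical stand-in for the QML pipeline):
    (1) a k-NN step selects the k training points closest to x;
    (2) a binary classifier (nearest centroid here) is fit on that
    neighborhood only and assigns a label to x."""
    # Step 1: k-NN locality filter, restricting attention to x's neighborhood.
    nearest = sorted(range(len(train)), key=lambda i: math.dist(train[i], x))[:k]
    # Step 2: binary classification on the neighborhood alone --
    # compute a centroid per class present among the k neighbors.
    centroids = {}
    for c in {labels[i] for i in nearest}:
        pts = [train[i] for i in nearest if labels[i] == c]
        centroids[c] = [sum(v) / len(pts) for v in zip(*pts)]
    # Predict the class whose local centroid is closest to x.
    return min(centroids, key=lambda c: math.dist(centroids[c], x))
```

Note the design point the abstract relies on: the classifier in step 2 never sees the full training set, only the k-element neighborhood, which is what reduces the size of the model that must be run on (quantum) hardware.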
(Copyright: © 2023 Zardini et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.)
The authors have declared that no competing interests exist.