Performance comparison of classical and quantum k-nearest neighbour algorithms
Publisher
Postgraduate Institute of Science (PGIS), University of Peradeniya, Sri Lanka
Abstract
Quantum machine learning (QML) algorithms do not always outperform their classical counterparts. In the present study, the performances of the Classical K-Nearest Neighbour (CKNN) and Quantum K-Nearest Neighbour (QKNN) algorithms were compared across different data sets and distance functions. Three data sets, differing in size and dimensionality (number of variables), and three distance functions commonly used with the CKNN and QKNN algorithms were considered. The Iris data set is the smallest, with 150 observations and the lowest dimensionality; the German Credit data set has 1,000 observations and medium dimensionality; and the MNIST data set has 70,000 observations and high dimensionality. The Euclidean, Mahalanobis, and Manhattan distance functions were chosen because their quantum counterparts are highly compatible with QML algorithms. Python libraries such as NumPy and Matplotlib, together with IBM Qiskit on Google Colab, were employed for the data analysis. For the Iris data set, CKNN achieved 100% accuracy for K = 3 to 10 (Euclidean and Manhattan) and 90% to 93.33% accuracy (Mahalanobis), while QKNN achieved 100% accuracy over all K values (1 to 10) and all distance functions. For the German Credit data set, CKNN achieved 80% accuracy for K = 1, 2 (Euclidean and Manhattan) and stable but fluctuating accuracy with Mahalanobis, while QKNN reached up to 75.76% accuracy at K = 6 (Euclidean and Manhattan) and was more stable with Mahalanobis. For the MNIST data set, CKNN recorded moderate accuracy (~83%) with Euclidean and Manhattan but performed poorly with Mahalanobis (<33%), while QKNN achieved 50% accuracy with Euclidean and Manhattan and failed with Mahalanobis (0-10%). QKNN demonstrated high accuracy on low-dimensional data sets but faced accuracy challenges on high-dimensional data due to Google Colab's hardware limitations and simulated quantum noise. CKNN maintained a stable accuracy level across varying data dimensions and sizes, making it a reliable choice for machine learning tasks.
QKNN shows promise, but its practical use depends on further advancements in quantum computing.
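As a concrete illustration of the classical side of the comparison (not the authors' code), the three distance functions can be sketched with a minimal NumPy K-nearest-neighbour classifier. The function name `knn_predict` and its parameters are assumptions for this sketch; for the Mahalanobis metric, the inverse covariance matrix of the training data must be supplied.

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=3, metric="euclidean", VI=None):
    """Classify each test point by majority vote among its k nearest
    training points under the chosen distance function."""
    preds = []
    for x in X_test:
        diff = X_train - x
        if metric == "euclidean":
            d = np.sqrt((diff ** 2).sum(axis=1))
        elif metric == "manhattan":
            d = np.abs(diff).sum(axis=1)
        elif metric == "mahalanobis":
            # VI: inverse covariance matrix of the training data,
            # e.g. VI = np.linalg.inv(np.cov(X_train, rowvar=False))
            d = np.sqrt(np.einsum("ij,jk,ik->i", diff, VI, diff))
        else:
            raise ValueError(f"unknown metric: {metric}")
        nearest = y_train[np.argsort(d)[:k]]
        labels, counts = np.unique(nearest, return_counts=True)
        preds.append(labels[np.argmax(counts)])
    return np.array(preds)
```

Swapping the `metric` argument while holding the data and K fixed reproduces the kind of metric-by-metric comparison described in the abstract; the quantum counterpart would replace the distance computation with a fidelity-based estimate on a Qiskit simulator.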
Citation
Proceedings of the International Conference on Mathematics and Mathematics Education (ICMME) - 2025, University of Peradeniya, p. 5