Title:
Adaptive Grover-driven optimization for quantum-inspired deep learning: A gradient-free training framework.
Source:
AIMS Mathematics; 2025, Vol. 10 Issue 11, p1-25, 25p
Database:
Complementary Index

Training deep neural networks remains difficult due to vanishing gradients, non-convex loss surfaces, and hyperparameter sensitivity. These obstacles are compounded in quantum machine learning, where barren plateaus, circuit depth, and hardware noise restrict the applicability of gradient-based approaches. To overcome these drawbacks, this study presents adaptive Grover-driven parallel quantum optimization (AG-PQO), a hybrid, gradient-free scheme that leverages Grover's quadratic search speedup together with adaptive loss-aware discretization and fidelity-based regularization. In contrast to classical optimizers such as Adam or evolutionary strategies (ES), which are either sensitive to the quality of gradient updates or scale poorly, AG-PQO performs Grover-accelerated candidate exploration across layers and reuses high-quality solutions through quantum memory caching. Experiments indicate that AG-PQO achieves 2%–3% higher accuracy than Adam and ES, and converges faster to a lower final loss than Adam, ES, and quantum feedforward-backpropagation (QFB). Notably, AG-PQO remains stable under simulated NISQ-level noise and has the potential to scale to near-term quantum processors. [ABSTRACT FROM AUTHOR]
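
The abstract describes a gradient-free training loop built from Grover-accelerated candidate search, adaptive loss-aware discretization, and caching of high-quality solutions. The following is a minimal, purely classical sketch of that idea on a one-weight toy regression, assuming a simulated amplitude-amplification loop in which a loss threshold plays the role of the Grover oracle; all names, parameters, and the caching scheme are illustrative assumptions, not the paper's AG-PQO implementation.

# Hypothetical classical simulation of a Grover-style, gradient-free weight search.
# Nothing here reproduces the paper's AG-PQO code; names and constants are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data for a single-weight "layer": y is roughly 2 * x.
x = rng.normal(size=100)
y = 2.0 * x + rng.normal(scale=0.1, size=100)

def loss(w):
    # Mean squared error of a candidate weight w.
    return float(np.mean((w * x - y) ** 2))

def candidate_grid(center, half_width, n=64):
    # Stand-in for adaptive discretization: a weight grid refined each round.
    return np.linspace(center - half_width, center + half_width, n)

def grover_search(candidates, threshold):
    # Classically simulate amplitude amplification over the candidate set.
    # The "oracle" marks candidates whose loss is below the threshold; each
    # iteration phase-flips the marked states and inverts about the mean,
    # boosting their probability of being sampled.
    amps = np.full(len(candidates), 1.0 / np.sqrt(len(candidates)))
    marked = np.array([loss(w) < threshold for w in candidates])
    if not marked.any():
        return None  # nothing marked; caller relaxes the threshold
    n_iters = max(1, int(np.pi / 4 * np.sqrt(len(candidates) / marked.sum())))
    for _ in range(n_iters):
        amps[marked] *= -1.0              # oracle: phase flip marked states
        amps = 2.0 * amps.mean() - amps   # diffusion: inversion about the mean
    probs = amps ** 2
    probs /= probs.sum()
    return float(candidates[rng.choice(len(candidates), p=probs)])

# Outer loop: cache good solutions and shrink the grid around the best one.
center, half_width = 0.0, 4.0
threshold = loss(center)
cache = []  # stand-in for the "quantum memory cache" of high-quality solutions
for round_idx in range(6):
    w = grover_search(candidate_grid(center, half_width), threshold)
    if w is None:
        threshold *= 1.5                  # relax the oracle and retry
        continue
    cache.append((loss(w), w))
    cache.sort()
    best_loss, best_w = cache[0]
    center, half_width = best_w, half_width / 2   # adaptive grid refinement
    threshold = best_loss                 # loss-aware threshold update
    print(f"round {round_idx}: w = {best_w:.4f}, loss = {best_loss:.4f}")

On a real device the oracle and diffusion steps would be quantum circuit operations and the number of marked candidates would be unknown; simulating them directly here is only meant to make the control flow of the loss-aware threshold and solution caching concrete.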

Copyright of AIMS Mathematics is the property of American Institute of Mathematical Sciences and its content may not be copied or emailed to multiple sites without the copyright holder's express written permission. Additionally, content may not be used with any artificial intelligence tools or machine learning technologies. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)