Date of Award
Summer 2022
Document Type
Open Access Thesis
Department
Computer Science and Engineering
First Advisor
Ramtin Zand
Abstract
While many of the most exciting quantum computing algorithms cannot be implemented until fault-tolerant quantum error correction is achieved, noisy intermediate-scale quantum (NISQ) devices allow smaller-scale applications that leverage the paradigm for speed-ups to be researched and realized. A currently popular application for these devices is quantum machine learning (QML). Work over the past few years indicates that QML algorithms can function just as well as their classical counterparts, and even outperform them in some cases. Many current QML models take advantage of variational quantum algorithm (VQA) circuits, given that their scale is typically small enough to be compatible with NISQ devices and that the method of automatic differentiation for optimizing circuit parameters is familiar from machine learning. As with skepticism about the benefits of quantum computing in general, there is some concern as to whether machine learning is the "best" use case for the advantages that NISQ devices make possible. To this end, this work investigates the use of stochastic methods inspired by QML in an attempt to approach the reported successes in performance. Using the long short-term memory (LSTM) model as a case study and analyzing the performance of classical, stochastic, and QML methods, this work aims to elucidate whether it is possible to achieve QML's benefits on classical machines by incorporating aspects of its stochasticity.
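The abstract's mention of VQA circuits optimized by automatic differentiation can be illustrated with a generic toy example (not drawn from the thesis itself): a single-qubit circuit RY(θ)|0⟩ whose cost is the expectation value ⟨Z⟩ = cos(θ), with the gradient computed by the parameter-shift rule commonly used for such circuits. All function names here are illustrative, assuming a plain NumPy statevector simulation.

```python
import numpy as np

def ry(theta):
    """Single-qubit rotation about the Y axis."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

Z = np.array([[1.0, 0.0], [0.0, -1.0]])  # Pauli-Z observable

def cost(theta):
    """Expectation <Z> of the state RY(theta)|0>, i.e. cos(theta)."""
    state = ry(theta) @ np.array([1.0, 0.0])
    return float(state @ Z @ state)

def parameter_shift_grad(theta, shift=np.pi / 2):
    """Exact gradient of cost(theta) via the parameter-shift rule."""
    return (cost(theta + shift) - cost(theta - shift)) / 2

# Gradient descent on the circuit parameter drives <Z> toward its minimum of -1.
theta, lr = 0.1, 0.4
for _ in range(100):
    theta -= lr * parameter_shift_grad(theta)
```

The parameter-shift rule evaluates the same circuit at shifted parameter values rather than backpropagating through it, which is what makes gradient-based training compatible with quantum hardware.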
Rights
© 2022, Joseph Lindsay
Recommended Citation
Lindsay, J. (2022). On Incorporating the Stochasticity of Quantum Machine Learning Into Classical Models. (Master's thesis). Retrieved from https://scholarcommons.sc.edu/etd/7014