Trainability of Dissipative Perceptron-Based Quantum Neural Networks

Kunal Sharma, M. Cerezo, Lukasz Cincio, and Patrick J. Coles
Phys. Rev. Lett. 128, 180505 – Published 6 May 2022

Abstract

Several architectures have been proposed for quantum neural networks (QNNs), with the goal of efficiently performing machine learning tasks on quantum data. Rigorous scaling results are urgently needed for specific QNN constructions to understand which, if any, will be trainable at a large scale. Here, we analyze the gradient scaling (and hence the trainability) for a recently proposed architecture that we call dissipative QNNs (DQNNs), where the input qubits of each layer are discarded at the layer’s output. We find that DQNNs can exhibit barren plateaus, i.e., gradients that vanish exponentially in the number of qubits. Moreover, we provide quantitative bounds on the scaling of the gradient for DQNNs under different conditions, such as different cost functions and circuit depths, and show that trainability is not always guaranteed. Our work represents the first rigorous analysis of the scalability of a perceptron-based QNN.
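As an illustrative aside (not code from the paper itself), the barren-plateau phenomenon the abstract describes — gradient variance decaying exponentially with the number of qubits — can be reproduced numerically for a generic randomly parameterized circuit with a "global" cost function. The ansatz (layers of RY rotations plus CZ entanglers), the depth, and the sample counts below are arbitrary illustrative choices, not the DQNN construction analyzed in the Letter; this is a minimal numpy sketch using the parameter-shift rule:

```python
import numpy as np

rng = np.random.default_rng(0)

def apply_ry(state, theta, q, n):
    """Apply a single-qubit RY(theta) rotation to qubit q of an n-qubit statevector."""
    psi = np.moveaxis(state.reshape([2] * n), q, 0)
    a, b = psi[0].copy(), psi[1].copy()
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    psi[0] = c * a - s * b
    psi[1] = s * a + c * b
    return np.moveaxis(psi, 0, q).reshape(-1)

def apply_cz(state, q1, q2, n):
    """Apply a CZ gate between qubits q1 and q2 (phase -1 on the |11> subspace)."""
    psi = state.reshape([2] * n)
    idx = [slice(None)] * n
    idx[q1], idx[q2] = 1, 1
    psi[tuple(idx)] *= -1
    return psi.reshape(-1)

def global_cost(thetas, n, layers):
    """C = 1 - |<0...0|U(thetas)|0...0>|^2, a 'global' cost function."""
    state = np.zeros(2 ** n, dtype=complex)
    state[0] = 1.0
    k = 0
    for _ in range(layers):
        for q in range(n):
            state = apply_ry(state, thetas[k], q, n)
            k += 1
        for q in range(n - 1):
            state = apply_cz(state, q, q + 1, n)
    return 1.0 - np.abs(state[0]) ** 2

def grad_variance(n, layers=20, samples=200):
    """Estimate Var[dC/dtheta_0] over random parameter sets via the parameter-shift rule."""
    grads = []
    for _ in range(samples):
        thetas = rng.uniform(0, 2 * np.pi, size=n * layers)
        plus, minus = thetas.copy(), thetas.copy()
        plus[0] += np.pi / 2
        minus[0] -= np.pi / 2
        grads.append(0.5 * (global_cost(plus, n, layers) - global_cost(minus, n, layers)))
    return np.var(grads)

# Gradient variance shrinks rapidly as qubits are added: a barren plateau.
variances = {n: grad_variance(n) for n in (2, 4, 6)}
for n, v in variances.items():
    print(f"n = {n} qubits: Var[grad] ~ {v:.2e}")
```

For this random real-valued ansatz the estimated variance drops by roughly an order of magnitude or more with each pair of added qubits, consistent with the exponential concentration the abstract refers to; the Letter's contribution is to prove such bounds rigorously for the dissipative perceptron-based architecture.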

  • Received 31 July 2020
  • Revised 22 September 2021
  • Accepted 6 April 2022

DOI: https://doi.org/10.1103/PhysRevLett.128.180505

© 2022 American Physical Society

Physics Subject Headings (PhySH)

Quantum Information, Science & Technology; Networks

Authors & Affiliations

Kunal Sharma1,2,*, M. Cerezo1,3,*, Lukasz Cincio1, and Patrick J. Coles1

  • 1Theoretical Division, Los Alamos National Laboratory, Los Alamos, New Mexico 87545, USA
  • 2Hearne Institute for Theoretical Physics and Department of Physics and Astronomy, Louisiana State University, Baton Rouge, Louisiana 70803, USA
  • 3Center for Nonlinear Studies, Los Alamos National Laboratory, Los Alamos, New Mexico 87545, USA

  • *K. S. and M. C. contributed equally to this work.

Article Text (Subscription Required)


Supplemental Material (Subscription Required)


References (Subscription Required)

Issue

Vol. 128, Iss. 18 — 6 May 2022

CHORUS

Article Available via CHORUS


