Tensor-network discriminator architecture for classification of quantum data on quantum computers

Michael L. Wall, Paraj Titum, Gregory Quiroz, Michael Foss-Feig, and Kaden R. A. Hazzard
Phys. Rev. A 105, 062439 – Published 22 June 2022

Abstract

We demonstrate the use of matrix product state (MPS) models for discriminating quantum data on quantum computers using holographic algorithms, focusing on the problem of classifying a translationally invariant quantum state based on L qubits of quantum data extracted from it. We detail a process in which data from single-shot experimental measurements are used to optimize an isometric tensor network, the isometric tensors are compiled into unitary quantum operations using greedy compilation heuristics, parameter optimization on the resulting quantum circuit model removes the postselection requirements of the isometric tensor model, and the resulting quantum model is inferenced on either product state (single-shot measurement) or entangled quantum data. We demonstrate our training and inference architecture on a synthetic dataset of six-site single-shot measurements from the bulk of a one-dimensional transverse field Ising model (TFIM) deep in its antiferromagnetic and paramagnetic phases. We find that increasing the bond dimension of the tensor-network model, amounting to adding more ancilla qubits to the circuit representation, improves both the average number of correct classifications across the dataset and the single-shot probability of correct classification. We experimentally evaluate models on Quantinuum's H1-2 trapped ion quantum computer using entangled input data modeled as translationally invariant, bond dimension 4 MPSs across the known quantum phase transition of the TFIM. Using linear regression on the experimental data near the transition point, we find predictions for the critical transverse field of h=0.962 and 0.994 for tensor-network discriminators of bond dimension χ=2 and χ=4, respectively. These predictions compare favorably with the known transition location of h=1 despite training on data far from the transition point. 
Our techniques identify families of short-depth variational quantum circuits in a data-driven and hardware-aware fashion, together with robust classical techniques to precondition the model parameters, and they can be adapted beyond machine learning to myriad applications of tensor networks on quantum computers, such as quantum simulation and error correction.
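The pipeline described in the abstract can be illustrated classically. The sketch below is a minimal, hypothetical toy model (not the authors' code): a translationally invariant isometric MPS of bond dimension χ consumes one data qubit per site, carrying a χ-dimensional bond register along the chain, and the final bond state is projected onto readout states to yield class probabilities. The random isometry stands in for a trained tensor; the renormalization at the end mirrors the postselection of the isometric model that the circuit-level optimization removes.

```python
import numpy as np

rng = np.random.default_rng(seed=7)
chi, d, L = 2, 2, 6   # bond dimension, local (qubit) dimension, number of sites

def random_isometry(chi, d, rng):
    """Random isometry W: C^(chi*d) -> C^chi with W W^dag = I_chi, built via QR.

    In the trained model this tensor would be optimized on single-shot
    measurement data; here it is random, purely for illustration.
    """
    a = rng.normal(size=(chi * d, chi)) + 1j * rng.normal(size=(chi * d, chi))
    q, _ = np.linalg.qr(a)                   # q: (chi*d, chi), q^dag q = I
    return q.conj().T.reshape(chi, chi, d)   # indices: (bond_out, bond_in, phys)

# One shared tensor per site models a translationally invariant MPS ansatz.
W = random_isometry(chi, d, rng)

def classify(bits, W, chi):
    """Contract the isometric MPS over one measurement shot (a bit string).

    Returns normalized probabilities for two class labels, obtained by
    projecting the final bond register onto computational basis states.
    """
    v = np.zeros(chi, dtype=complex)
    v[0] = 1.0                                 # initial bond state
    for b in bits:
        x = np.zeros(d)
        x[b] = 1.0                             # single-shot outcome |b>
        v = np.einsum('abi,b,i->a', W, v, x)   # absorb one data qubit
    p = np.abs(v[:2]) ** 2
    return p / p.sum()                         # renormalize (postselection)

probs = classify([0, 1, 0, 1, 0, 1], W, chi)   # a six-site shot, as in the paper
print(probs)
```

Increasing χ enlarges the bond register, which on hardware corresponds to adding ancilla qubits to the holographic circuit; the critical-field estimates quoted above come from fitting the classifier output versus transverse field near the transition and locating its crossing by linear regression.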

  • Received 24 February 2022
  • Accepted 7 June 2022

DOI:https://doi.org/10.1103/PhysRevA.105.062439

©2022 American Physical Society

Physics Subject Headings (PhySH)

Quantum Information, Science & Technology; Condensed Matter, Materials & Applied Physics; Networks

Authors & Affiliations

Michael L. Wall*, Paraj Titum, and Gregory Quiroz

  • The Johns Hopkins University Applied Physics Laboratory, Laurel, Maryland 20723, USA

Michael Foss-Feig

  • Quantinuum, 303 S. Technology Ct., Broomfield, Colorado 80021, USA

Kaden R. A. Hazzard

  • Department of Physics and Astronomy, Rice University, Houston, Texas 77005-1892, USA and Rice Center for Quantum Materials, Rice University, Houston, Texas 77005-1892, USA

  • *Michael.Wall@jhuapl.edu

Issue

Vol. 105, Iss. 6 — June 2022
