Abstract
Restricted Boltzmann machines (RBMs) and deep Boltzmann machines (DBMs) are important models in machine learning, and have recently found numerous applications in quantum many-body physics. We show that there are fundamental connections between them and tensor networks. In particular, we demonstrate that any RBM and DBM can be exactly represented as a two-dimensional tensor network. This representation characterizes the expressive power of RBMs and DBMs through the entanglement structures of the tensor networks, and also provides an efficient tensor network contraction algorithm for computing the partition function of RBMs and DBMs. Using numerical experiments, we show that the proposed algorithm is more accurate than state-of-the-art machine learning methods in estimating the partition function of RBMs and DBMs, and has potential applications in training DBMs for general machine learning tasks.
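The partition function the abstract refers to can be illustrated at toy scale. The following is a minimal sketch, not the paper's tensor network algorithm: for a small RBM with ±1 units, the hidden layer can be traced out analytically (each hidden unit contributes a factor of 2·cosh), and the result can be checked against brute-force summation over all joint configurations. All sizes and parameter values below are arbitrary choices for illustration.

```python
import itertools
import numpy as np

# Toy RBM with +/-1 (spin) units; sizes chosen small enough for exact enumeration.
rng = np.random.default_rng(0)
nv, nh = 4, 3
W = rng.normal(scale=0.3, size=(nv, nh))  # couplings
a = rng.normal(scale=0.1, size=nv)        # visible biases
b = rng.normal(scale=0.1, size=nh)        # hidden biases

def Z_bruteforce():
    # Z = sum over all (v, h) of exp(v.W.h + a.v + b.h)
    Z = 0.0
    for v in itertools.product([-1, 1], repeat=nv):
        for h in itertools.product([-1, 1], repeat=nh):
            v_arr, h_arr = np.array(v), np.array(h)
            Z += np.exp(v_arr @ W @ h_arr + a @ v_arr + b @ h_arr)
    return Z

def Z_hidden_traced():
    # Hidden units factor out: Z = sum_v exp(a.v) * prod_j 2*cosh(b_j + (W^T v)_j)
    Z = 0.0
    for v in itertools.product([-1, 1], repeat=nv):
        v_arr = np.array(v)
        Z += np.exp(a @ v_arr) * np.prod(2.0 * np.cosh(b + W.T @ v_arr))
    return Z
```

The second form still sums over exponentially many visible configurations; the point of the paper's tensor network representation is to organize such sums as a two-dimensional contraction that can be evaluated efficiently.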
- Received 28 May 2021
- Revised 3 August 2021
- Accepted 10 August 2021
DOI: https://doi.org/10.1103/PhysRevB.104.075154
©2021 American Physical Society