Open Access

Neural Canonical Transformation with Symplectic Flows

Shuo-Hui Li, Chen-Xiao Dong, Linfeng Zhang, and Lei Wang
Phys. Rev. X 10, 021020 – Published 28 April 2020

Abstract

Canonical transformation plays a fundamental role in simplifying and solving classical Hamiltonian systems. Intriguingly, it has a natural correspondence to normalizing flows with a symplectic constraint. Building on this key insight, we design a neural canonical transformation approach to automatically identify independent slow collective variables in general physical systems and natural datasets. We present an efficient implementation of symplectic neural coordinate transformations and two ways to train the model based either on the Hamiltonian function or phase-space samples. The learned model maps physical variables onto an independent representation where collective modes with different frequencies are separated, which can be useful for various downstream tasks such as compression, prediction, control, and sampling. We demonstrate the ability of this method first by analyzing toy problems and then by applying it to real-world problems, such as identifying and interpolating slow collective modes of the alanine dipeptide molecule and MNIST database images.
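
For orientation, the symplectic constraint referenced in the abstract is the standard one from classical mechanics: the Jacobian $M$ of the phase-space map $(q, p) \to (Q, P)$ must preserve the symplectic form,

  $M = \dfrac{\partial (Q, P)}{\partial (q, p)}, \qquad M^{\mathsf{T}} J M = J, \qquad J = \begin{pmatrix} 0 & I \\ -I & 0 \end{pmatrix},$

which is exactly the condition for the transformation to be canonical.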

  • Received 13 October 2019
  • Revised 22 January 2020
  • Accepted 9 March 2020

DOI: https://doi.org/10.1103/PhysRevX.10.021020

Published by the American Physical Society under the terms of the Creative Commons Attribution 4.0 International license. Further distribution of this work must maintain attribution to the author(s) and the published article’s title, journal citation, and DOI.


Physics Subject Headings (PhySH)

Statistical Physics & Thermodynamics, Nonlinear Dynamics, Polymers & Soft Matter

Authors & Affiliations

Shuo-Hui Li1,2, Chen-Xiao Dong1,2, Linfeng Zhang3,*, and Lei Wang1,4,†

  • 1Institute of Physics, Chinese Academy of Sciences, Beijing 100190, China
  • 2University of Chinese Academy of Sciences, Beijing 100049, China
  • 3Program in Applied and Computational Mathematics, Princeton University, Princeton, New Jersey 08544, USA
  • 4Songshan Lake Materials Laboratory, Dongguan, Guangdong 523808, China

  • *linfengz@princeton.edu
  • †wanglei@iphy.ac.cn

Popular Summary

For centuries, physicists and astronomers have tackled the dynamics of complex interacting systems (such as the Sun-Earth-Moon system) with a mathematical tool known as a canonical transformation. Such a transformation changes the coordinates of the Hamiltonian equations (which describe the time evolution of the system) to simplify computation while preserving the form of the equations. However, although the concept runs deep, its wider application has been limited by the cumbersome manual inspection and manipulation it traditionally requires. By exploiting the inherent connection between canonical transformations and a modern machine-learning method known as the normalizing flow, we have constructed a neural canonical transformation that can be trained automatically using either the Hamiltonian function or data.
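
Concretely, "preserving the form of the equations" means the following (a standard fact of classical mechanics, not specific to this paper): Hamilton's equations in the original variables,

  $\dot{q} = \dfrac{\partial H}{\partial p}, \qquad \dot{p} = -\dfrac{\partial H}{\partial q},$

retain exactly the same structure in the transformed variables $(Q, P)$ with a new Hamiltonian $K(Q, P)$:

  $\dot{Q} = \dfrac{\partial K}{\partial P}, \qquad \dot{P} = -\dfrac{\partial K}{\partial Q}.$

A transformation that decouples or simplifies $K$ can therefore turn an intractable dynamics into one that is easy to integrate.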

Normalizing flows are adaptive transformations, often implemented as deep neural networks, that find many real-world applications such as speech synthesis and image generation. In essence, a normalizing flow is an invertible change of variables that deforms a complex probability distribution into a simpler one. Canonical transformations are normalizing flows, albeit with two crucial twists. First, they are flows in phase space, which contains both coordinates and momenta. Second, these flows satisfy the symplectic condition, a mathematical property that underlies most of the intriguing features of classical mechanics.
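
In equations (a standard identity, stated here for orientation): a flow $x = f(z)$ deforms a simple base density $p_Z$ into the model density $p_X$ via the change of variables

  $p_X(x) = p_Z\big(f^{-1}(x)\big) \left| \det \dfrac{\partial f^{-1}(x)}{\partial x} \right|,$

and the symplectic condition $M^{\mathsf{T}} J M = J$ forces $|\det M| = 1$, so a canonical transformation reshapes the phase-space distribution without compressing or dilating volume, in line with Liouville's theorem.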

An immediate application of the neural canonical transformation is to simplify complex dynamics into independent nonlinear modes, thereby allowing one to identify the small number of slow modes that are essential for applications such as molecular dynamics and dynamical control. Our work also stands as an example of building physical principles into the design of deep neural networks for better modeling of natural data.
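
As a toy illustration of a symplectic building block (a minimal sketch for intuition, not the authors' implementation; the names f, point_transform, and num_jacobian are hypothetical): a coordinate change $Q = f(q)$ becomes canonical when the momenta transform with the inverse-transpose Jacobian of $f$, i.e. $P = (\partial f / \partial q)^{-\mathsf{T}} p$, and the symplectic condition $M^{\mathsf{T}} J M = J$ can then be verified numerically.

```python
# Minimal sketch (not the paper's code): a canonical "point transformation"
# Q = f(q), P = (df/dq)^{-T} p, with a numerical check of the symplectic
# condition M^T J M = J on the full phase-space Jacobian M.
import numpy as np

def f(q):
    # Any smooth invertible coordinate map; this toy choice is hypothetical.
    return q + 0.1 * np.sin(q)

def jac_f(q):
    # Jacobian df/dq; diagonal because f above acts elementwise.
    return np.diag(1.0 + 0.1 * np.cos(q))

def point_transform(q, p):
    # Momenta transform with the inverse-transpose Jacobian, which is
    # exactly what makes a coordinate change canonical.
    A = jac_f(q)
    return f(q), np.linalg.solve(A.T, p)  # P = A^{-T} p

def num_jacobian(q, p, eps=1e-6):
    # Central-difference Jacobian of the map (q, p) -> (Q, P).
    n = len(q)
    x = np.concatenate([q, p])
    M = np.zeros((2 * n, 2 * n))
    for i in range(2 * n):
        dx = np.zeros(2 * n)
        dx[i] = eps
        Qp, Pp = point_transform(*np.split(x + dx, 2))
        Qm, Pm = point_transform(*np.split(x - dx, 2))
        M[:, i] = np.concatenate([Qp - Qm, Pp - Pm]) / (2 * eps)
    return M

n = 3
rng = np.random.default_rng(0)
q, p = rng.standard_normal(n), rng.standard_normal(n)
J = np.block([[np.zeros((n, n)), np.eye(n)],
              [-np.eye(n), np.zeros((n, n))]])
M = num_jacobian(q, p)
print(np.allclose(M.T @ J @ M, J, atol=1e-5))  # True: the map is symplectic
```

Swapping the fixed nonlinearity f for a learnable invertible map is the spirit of the neural coordinate transformations described in the abstract, though the actual architecture in the paper may differ.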

Issue

Vol. 10, Iss. 2 — April–June 2020

