Abstract
We derive a well-defined renormalized version of mutual information that allows us to estimate the dependence between continuous random variables in the important case when one is deterministically dependent on the other. This is the situation relevant for feature extraction, where the goal is to produce a low-dimensional effective description of a high-dimensional system. Our approach enables the discovery of collective variables in physical systems, thus adding to the toolbox of artificial scientific discovery, while also aiding the analysis of information flow in artificial neural networks.
- Received 21 May 2020
- Revised 23 February 2021
- Accepted 2 April 2021
DOI: https://doi.org/10.1103/PhysRevLett.126.200601
Published by the American Physical Society under the terms of the Creative Commons Attribution 4.0 International license. Further distribution of this work must maintain attribution to the author(s) and the published article’s title, journal citation, and DOI. Open access publication funded by the Max Planck Society.