Abstract
Entropy and information provide natural measures of correlation among elements in a network. We construct here the information theoretic analog of connected correlation functions: irreducible N-point correlation is measured by a decrease in entropy for the joint distribution of N variables relative to the maximum entropy allowed by all the observed N−1 variable distributions. We calculate the “connected information” terms for several examples and show that it also enables the decomposition of the information that is carried by a population of elements about an outside source.
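As an illustration of the quantity described above, the following is a minimal sketch (not the paper's code) of the lowest-order case. For N = 2, the maximum-entropy distribution consistent with the single-variable marginals is the product distribution, so the order-2 connected information reduces to the familiar mutual information I(X;Y) = S(X) + S(Y) − S(X,Y). The function and variable names here are illustrative choices, not from the paper.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability array."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def connected_info_2(pxy):
    """Order-2 connected information of a joint distribution pxy.

    For N = 2 the maximum-entropy distribution consistent with the
    observed single-variable marginals is the product of those
    marginals, so the entropy drop equals the mutual information.
    """
    px = pxy.sum(axis=1)                 # marginal of X
    py = pxy.sum(axis=0)                 # marginal of Y
    s_max = entropy(px) + entropy(py)    # max entropy given 1-variable marginals
    s_joint = entropy(pxy)               # actual joint entropy
    return s_max - s_joint

# Example: a perfectly correlated binary pair carries 1 bit
# of irreducible pairwise correlation.
pxy = np.array([[0.5, 0.0],
                [0.0, 0.5]])
print(connected_info_2(pxy))  # → 1.0
```

For higher orders the maximum-entropy distribution consistent with all (N−1)-variable marginals generally has no closed form and must be found by iterative methods, but the structure of the calculation — entropy of the constrained maximum-entropy model minus entropy of the true joint — is the same.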
- Received 15 July 2003
DOI: https://doi.org/10.1103/PhysRevLett.91.238701
©2003 American Physical Society