Unsupervised and supervised learning:  Mutual information between parameters and observations

Didier Herschkowitz and Jean-Pierre Nadal
Phys. Rev. E 59, 3344 – Published 1 March 1999

Abstract

We study the mutual information between parameter and data for a family of supervised and unsupervised learning tasks. The parameter is a possibly, but not necessarily, high-dimensional vector. We derive exact bounds and asymptotic behaviors for the mutual information as a function of the data size and of some properties of the probability of the data given the parameter. We compare these exact results with the predictions of replica calculations. We briefly discuss the universal properties of the mutual information as a function of data size.
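As a toy illustration of the quantity studied here (not taken from the paper): for a one-dimensional Gaussian parameter observed through additive Gaussian noise, the mutual information between the parameter and p observations has a simple closed form, and its growth with data size shows the logarithmic asymptotics familiar from smooth parametric families. The model, variances, and function names below are illustrative assumptions, not the paper's setup.

```python
import math

# Toy conjugate-Gaussian model (illustrative, not the paper's setup):
#   theta ~ N(0, s2_prior);  each observation x_i = theta + N(0, s2_noise).
# The posterior variance after p observations is 1/(1/s2_prior + p/s2_noise),
# so the mutual information between theta and the data is exactly
#   I(theta; x_1..x_p) = (1/2) * ln(1 + p * s2_prior / s2_noise),
# which grows like (1/2) * ln(p) for large p (one-dimensional parameter).
def mutual_information(p, s2_prior=1.0, s2_noise=1.0):
    return 0.5 * math.log(1.0 + p * s2_prior / s2_noise)

if __name__ == "__main__":
    for p in (1, 10, 100, 1000):
        print(f"p = {p:5d}   I = {mutual_information(p):.3f} nats")
```

For a d-dimensional parameter the same logarithmic scaling appears with a prefactor d/2, consistent with the universal behavior of the mutual information as a function of data size discussed in the abstract.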

  • Received 3 August 1998

DOI:https://doi.org/10.1103/PhysRevE.59.3344

©1999 American Physical Society

Authors & Affiliations

Didier Herschkowitz* and Jean-Pierre Nadal

  • Laboratoire de Physique Statistique de l’E.N.S., Ecole Normale Supérieure, 24, rue Lhomond-75231 Paris Cedex 05, France

  • *Electronic addresses: herschko@lps.ens.fr, nadal@lps.ens.fr; URL: http://www.lps.ens.fr/~risc/rescomp/
  • The laboratory is associated with the CNRS (URA 1306), ENS, and the Universities Paris VI and Paris VII.

Issue

Vol. 59, Iss. 3 — March 1999
