Eigenvalues of covariance matrices: Application to neural-network learning

Yann Le Cun, Ido Kanter, and Sara A. Solla
Phys. Rev. Lett. 66, 2396 – Published 6 May 1991

Abstract

The learning time of a simple neural-network model is obtained through an analytic computation of the eigenvalue spectrum for the Hessian matrix, which describes the second-order properties of the objective function in the space of coupling coefficients. The results are generic for symmetric matrices obtained by summing outer products of random vectors. The form of the eigenvalue distribution suggests new techniques for accelerating the learning process, and provides a theoretical justification for the choice of centered versus biased state variables.
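The effect described in the abstract can be illustrated numerically. The sketch below (an assumption-laden toy, not the paper's actual computation) builds a symmetric matrix as a normalized sum of outer products of P random N-dimensional vectors and compares the eigenvalue spectrum for centered (zero-mean) versus biased (nonzero-mean) state variables; the biased case develops a large outlier eigenvalue, inflating the condition number that governs gradient-descent learning time.

```python
import numpy as np

rng = np.random.default_rng(0)
P, N = 2000, 100  # number of random vectors, dimension

def spectrum(mean):
    # Symmetric N x N matrix: (1/P) * sum_mu x_mu x_mu^T,
    # i.e. the second-moment (covariance-like) matrix of the inputs.
    X = rng.normal(loc=mean, scale=1.0, size=(P, N))
    H = X.T @ X / P
    return np.linalg.eigvalsh(H)

ev_centered = spectrum(mean=0.0)  # centered state variables
ev_biased = spectrum(mean=1.0)    # biased (nonzero-mean) variables

# A nonzero mean adds a rank-one component whose eigenvalue grows like
# N * mean^2, so the spectrum acquires a large outlier and the condition
# number (max/min eigenvalue ratio) blows up, slowing learning.
cond_centered = ev_centered.max() / ev_centered.min()
cond_biased = ev_biased.max() / ev_biased.min()
print(cond_centered, cond_biased)
```

Here the worse conditioning of the biased case is the toy analogue of the paper's argument for preferring centered state variables.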

  • Received 2 January 1991

DOI:https://doi.org/10.1103/PhysRevLett.66.2396

©1991 American Physical Society

Authors & Affiliations

Yann Le Cun, Ido Kanter, and Sara A. Solla

  • AT&T Bell Laboratories, Holmdel, New Jersey 07733
  • Department of Physics, Bar-Ilan University, Ramat-Gan 52100, Israel

Issue

Vol. 66, Iss. 18 — 6 May 1991
