Training a perceptron in a discrete weight space

Michal Rosen-Zvi and Ido Kanter
Phys. Rev. E 64, 046109 – Published 20 September 2001

Abstract

Learning in a perceptron having a discrete weight space, where each weight can take 2L+1 different values, is examined analytically and numerically. The learning algorithm is based on training a continuous perceptron while predictions follow the clipped weights. The learning is described by a new set of order parameters, composed of the overlaps between the teacher and the continuous/clipped students. Different scenarios are examined, among them on-line learning with discrete and continuous transfer functions. The generalization error of the clipped weights decays asymptotically as exp(−Kα²) in the case of on-line learning with binary activation functions, and as exp(−e^{|λ|α}) in the case of on-line learning with a continuous one, where α is the number of examples divided by N, the size of the input vector, and K is a positive constant. For finite N and L, perfect agreement between the discrete student and the teacher is obtained for α ∝ √(L ln(NL)). A crossover to the generalization error ∝ 1/α, characterizing continuous weights with binary output, is obtained for synaptic depth L > O(√N).
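The clipped-weights scheme described above can be illustrated with a short numerical sketch. The Python snippet below is a minimal illustration, not the paper's exact algorithm: it assumes a standard perceptron on-line update and a simple scale-and-round discretization rule (the paper derives its own dynamics and clipping), and it measures the teacher-student overlaps that play the role of the order parameters. The mapping from overlap R to generalization error, ε_g = arccos(R)/π, is the standard result for binary-output perceptrons with random inputs. All variable names and the discretization rule are illustrative assumptions.

    import numpy as np

    N = 1000   # input dimension
    L = 3      # synaptic depth: discrete weights take the 2L+1 values {-L, ..., L}
    rng = np.random.default_rng(0)

    # Discrete teacher; continuous student initialized at zero.
    w_teacher = rng.integers(-L, L + 1, size=N).astype(float)
    w = np.zeros(N)

    def clip_weights(w, L):
        # Assumed discretization rule: rescale to [-L, L], then round to the
        # nearest integer level. The paper's precise clipping may differ.
        s = np.max(np.abs(w))
        if s == 0:
            return np.zeros_like(w)
        return np.round(w / s * L)

    # On-line learning: update the continuous weights on each mistake
    # (perceptron-type rule), for alpha = 20 examples per input dimension.
    for _ in range(20 * N):
        x = rng.choice([-1.0, 1.0], size=N)
        label = np.sign(w_teacher @ x)
        if np.sign(w @ x) != label:
            w += label * x / np.sqrt(N)

    # Order parameters: overlaps of the teacher with the continuous
    # and the clipped student, and the clipped generalization error.
    def overlap(a, b):
        return (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

    w_clipped = clip_weights(w, L)
    R_cont = overlap(w, w_teacher)
    R_clip = overlap(w_clipped, w_teacher)
    eps_g = np.arccos(np.clip(R_clip, -1.0, 1.0)) / np.pi
    print(f"alpha=20  R_cont={R_cont:.3f}  R_clip={R_clip:.3f}  eps_g(clipped)={eps_g:.4f}")

Running the sketch for increasing α shows both overlaps approaching 1, with the clipped overlap reaching exact agreement (R_clip = 1, ε_g = 0) once the continuous weights settle near the teacher's discrete levels, in qualitative accord with the finite-N, finite-L result quoted above.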

  • Received 5 February 2001

DOI: https://doi.org/10.1103/PhysRevE.64.046109

©2001 American Physical Society

Authors & Affiliations

Michal Rosen-Zvi and Ido Kanter

  • Minerva Center and the Department of Physics, Bar-Ilan University, Ramat-Gan 52900, Israel

Issue

Vol. 64, Iss. 4 — October 2001

Reuse & Permissions
Access Options
Author publication services for translation and copyediting assistance advertisement

Authorization Required


×
×

Images

×

Sign up to receive regular email alerts from Physical Review E

Log In

Cancel
×

Search


Article Lookup

Paste a citation or DOI

Enter a citation
×