Abstract
Learning in a perceptron having a discrete weight space, where each weight can take 2L+1 different values, is examined analytically and numerically. The learning algorithm is based on the training of the continuous perceptron and prediction following the clipped weights. The learning is described by a new set of order parameters, composed of the overlaps between the teacher and the continuous/clipped students. Different scenarios are examined, among them on-line learning with discrete and continuous transfer functions. The generalization error of the clipped weights decays asymptotically as exp(−Kα²) in the case of on-line learning with binary activation functions, and as exp(−e^{|λ|α}) in the case of on-line learning with a continuous one, where α is the number of examples divided by N, the size of the input vector, and K is a positive constant. For finite N and L, perfect agreement between the discrete student and the teacher is obtained for α ∝ √(L ln(NL)). A crossover to the 1/α generalization error characterizing continuous weights with binary output is obtained for synaptic depth L > O(√N).
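The training scheme described above — updating a continuous perceptron on-line while making predictions with its clipped, discretized weights — can be illustrated with a minimal numerical sketch. The Hebbian update rule, the clipping scheme, and all parameter values below are illustrative assumptions, not the paper's exact prescription; the teacher's weights are drawn from the 2L+1 integer levels {−L, …, L}, and the overlap between teacher and clipped student serves as the order parameter.

```python
import numpy as np

# Minimal sketch (illustrative assumptions, not the paper's exact rule):
# on-line Hebbian training of a continuous perceptron, with predictions
# made by the *clipped* weights, which take 2L+1 discrete values.

rng = np.random.default_rng(0)
N, L = 1000, 2          # input dimension, synaptic depth
alpha_max = 20          # number of examples per input dimension

teacher = rng.integers(-L, L + 1, size=N).astype(float)
J = np.zeros(N)         # continuous student weights

def clip_weights(J, L):
    """Map continuous weights onto the 2L+1 discrete levels {-L, ..., L}."""
    scale = np.max(np.abs(J)) / L if np.any(J) else 1.0
    return np.clip(np.round(J / scale), -L, L)

errors = 0
P = alpha_max * N
for p in range(P):
    x = rng.standard_normal(N)
    sigma_teacher = np.sign(teacher @ x)
    # Prediction follows the clipped weights
    sigma_clipped = np.sign(clip_weights(J, L) @ x)
    errors += sigma_clipped != sigma_teacher
    # Hebbian on-line update of the continuous weights (assumed rule)
    J += sigma_teacher * x / np.sqrt(N)

# Overlap order parameter between teacher and clipped student
Jc = clip_weights(J, L)
R = (teacher @ Jc) / (np.linalg.norm(teacher) * np.linalg.norm(Jc))
print(f"running error rate: {errors / P:.4f}, overlap R: {R:.4f}")
```

As α grows, the continuous weights align with the teacher and the clipped student's error rate drops; tracking the teacher–continuous and teacher–clipped overlaps separately is what the abstract's new set of order parameters formalizes.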
Received 5 February 2001
DOI: https://doi.org/10.1103/PhysRevE.64.046109
©2001 American Physical Society