Abstract
We investigate the Vapnik-Chervonenkis (VC) dimension of the perceptron and of simple two-layer networks, such as the committee machine and the parity machine, with weights restricted to the values ±1. For binary inputs, the VC dimension is determined by atypical pattern sets; i.e., it cannot be found by replica analysis or by numerical Monte Carlo sampling. For small systems, exhaustive enumerations yield exact results. For systems too large for enumeration, number-theoretic arguments give lower bounds on the VC dimension. For the Ising perceptron, the VC dimension is probably larger than N/2.
Received 11 September 1996
DOI: https://doi.org/10.1103/PhysRevE.55.4478
©1997 American Physical Society
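The exhaustive enumeration mentioned in the abstract can be illustrated for very small systems. The sketch below (an illustrative reconstruction, not the authors' code) takes an Ising perceptron with weights w ∈ {±1}^N acting on binary inputs x ∈ {±1}^N via sign(w·x), and finds the largest p for which some p-pattern set is shattered, i.e., realizes all 2^p dichotomies. Odd N is assumed so that w·x is never zero.

```python
from itertools import product, combinations

def sign(x):
    # w.x is an odd sum of +-1 terms for odd N, so it is never zero
    return 1 if x > 0 else -1

def dichotomies(patterns, N):
    """Set of labelings induced on `patterns` by all 2^N Ising weight vectors."""
    return {
        tuple(sign(sum(wi * xi for wi, xi in zip(w, x))) for x in patterns)
        for w in product((-1, 1), repeat=N)
    }

def vc_dimension(N):
    """VC dimension by exhaustive enumeration: the largest p such that
    SOME set of p binary patterns is shattered (all 2^p dichotomies occur)."""
    inputs = list(product((-1, 1), repeat=N))
    d = 0
    for p in range(1, 2 ** N + 1):
        if any(len(dichotomies(S, N)) == 2 ** p
               for S in combinations(inputs, p)):
            d = p
        else:
            break  # shattering is monotone: no larger set can be shattered
    return d

print(vc_dimension(3))  # -> 3
```

Note the search is over atypical pattern sets: a single shattered set suffices, which is why, as the abstract stresses, sampling typical pattern sets (replica analysis or Monte Carlo) cannot determine the VC dimension. The cost grows doubly exponentially in N, motivating the number-theoretic lower bounds used for larger systems.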