Abstract
We show that a message-passing process allows us to store in binary "material" synapses a number of random patterns that almost saturates the information-theoretic bounds. We apply the learning algorithm to networks characterized by a wide range of different connection topologies and of size comparable with that of biological systems (e.g., ). The algorithm can be turned into an online, fault-tolerant learning protocol of potential interest in modeling aspects of synaptic plasticity and in building neuromorphic devices.
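The underlying storage problem can be illustrated with a minimal sketch: N binary synapses w ∈ {−1, +1} must classify P random ±1 patterns. The code below is not the authors' message-passing algorithm; it is a simpler clipped-perceptron baseline for the same problem, in which each synapse keeps a hidden integer state and only exposes its sign. All parameter values (N, P, the clipping bound K, the number of sweeps) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 101, 40  # synapses and random patterns to store (illustrative sizes)
xi = rng.choice([-1, 1], size=(P, N))   # random input patterns
sigma = rng.choice([-1, 1], size=P)     # desired binary outputs

# Each synapse keeps a hidden integer state; the visible binary
# synaptic weight is the sign of that state (0 mapped to +1).
h = rng.integers(-5, 6, size=N)

def weights():
    return np.where(h >= 0, 1, -1)

def n_errors():
    """Number of patterns not yet stored (wrong or zero output)."""
    return int(np.sum(sigma * (xi @ weights()) <= 0))

K = 10  # clipping bound on hidden states (assumed value)
for sweep in range(2000):
    if n_errors() == 0:
        break
    # Online update: on each misclassified pattern, nudge the hidden
    # states toward the desired output, then clip them to [-K, K].
    for mu in rng.permutation(P):
        if sigma[mu] * (xi[mu] @ weights()) <= 0:
            h = np.clip(h + sigma[mu] * xi[mu], -K, K)

print("remaining errors:", n_errors())
```

A pattern μ counts as stored when sigma[mu] * (xi[mu] @ w) > 0, i.e. the binary-weight perceptron reproduces the desired output. Simple local rules of this kind degrade as N grows, which is precisely the gap that the message-passing approach described in the abstract addresses.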
- Received 8 November 2005
DOI: https://doi.org/10.1103/PhysRevLett.96.030201
©2006 American Physical Society