Storage capacity of networks with discrete synapses and sparsely encoded memories

Yu Feng and Nicolas Brunel
Phys. Rev. E 105, 054408 – Published 16 May 2022

Abstract

Attractor neural networks are one of the leading theoretical frameworks for the formation and retrieval of memories in networks of biological neurons. In this framework, a pattern imposed by external inputs to the network is said to be learned when it becomes a fixed-point attractor of the network dynamics. The storage capacity is the maximum number of patterns that can be learned by the network. In this paper, we study the storage capacity of fully connected and sparsely connected networks with a binarized Hebbian rule, for arbitrary coding levels. Our results show that a network with discrete synapses has a storage capacity similar to that of the model with continuous synapses, and that this capacity tends asymptotically towards the optimal capacity, in the space of all possible binary connectivity matrices, in the sparse coding limit. We also derive finite-coding-level corrections to the asymptotic solution in the sparse coding limit. These corrections indicate that the capacity of networks with Hebbian learning rules converges to the optimal capacity extremely slowly as the coding level becomes small. Our results also show that in networks with sparse binary connectivity matrices, the information capacity per synapse is larger than in the fully connected case, and thus such networks store information more efficiently.
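
As a concrete illustration of the model class described in the abstract, the sketch below implements a Willshaw-style clipped Hebbian rule, a standard example of a binarized Hebbian rule (not necessarily the exact rule analyzed in the paper), and counts how many sparse patterns are fixed points of the threshold dynamics. The network size N, coding level f, pattern count P, and threshold theta are illustrative, hand-chosen values.

    import numpy as np

    rng = np.random.default_rng(seed=0)

    N = 1000   # number of neurons
    f = 0.05   # coding level: fraction of active units per pattern
    P = 40     # number of patterns to store

    # Sparse binary memory patterns: each unit is active with probability f.
    patterns = (rng.random((P, N)) < f).astype(int)

    # Willshaw-style binarized Hebbian rule: binary synapse W_ij is switched
    # on if units i and j were coactive in at least one stored pattern.
    W = (patterns.T @ patterns > 0).astype(int)
    np.fill_diagonal(W, 0)

    # Threshold dynamics: a unit fires if its recurrent input exceeds theta.
    # theta is a hand-tuned illustrative value, just below the roughly f*N
    # input an active unit receives when its own pattern is presented.
    theta = 0.9 * f * N

    def step(state):
        return (W @ state > theta).astype(int)

    # A pattern counts as learned if it is a fixed point of the update.
    n_fixed = sum(np.array_equal(step(p), p) for p in patterns)
    print(f"{n_fixed} of {P} patterns are fixed points")

Increasing P at fixed f in a sketch like this eventually saturates the binary connectivity matrix and retrieval fails, which gives a qualitative feel for the capacity-versus-coding-level trade-off the abstract describes.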

  • Received 13 December 2021
  • Accepted 11 March 2022

DOI: https://doi.org/10.1103/PhysRevE.105.054408

©2022 American Physical Society

Physics Subject Headings (PhySH)

  • Research Areas: Physics of Living Systems

Authors & Affiliations

Yu Feng1,* and Nicolas Brunel1,2

  • 1Department of Physics, Duke University, Durham, North Carolina 27710, USA
  • 2Department of Neurobiology, Duke University, Durham, North Carolina 27710, USA

  • *Corresponding author: yu.feng707@duke.edu

Issue

Vol. 105, Iss. 5 — May 2022
