Author
Listed:
- Alexis M Dubreuil
- Yali Amit
- Nicolas Brunel
Abstract
In standard attractor neural network models, specific patterns of activity are stored in the synaptic matrix, so that they become fixed point attractors of the network dynamics. The storage capacity of such networks has been quantified in two ways: the maximal number of patterns that can be stored, and the stored information measured in bits per synapse. In this paper, we compute both quantities in fully connected networks of N binary neurons with binary synapses, storing patterns with coding level f, in the large N and sparse coding limits (N → ∞, f → 0). We also derive finite-size corrections that accurately reproduce the results of simulations in networks of tens of thousands of neurons. These methods are applied to three different scenarios: (1) the classic Willshaw model, (2) networks with stochastic learning in which patterns are shown only once (one-shot learning), (3) networks with stochastic learning in which patterns are shown multiple times. The storage capacities are optimized over network parameters, which allows us to compare the performance of the different models. We show that finite-size effects strongly reduce the capacity, even for networks of realistic sizes. We discuss the implications of these results for memory storage in the hippocampus and cerebral cortex.
Author Summary
Two central hypotheses in neuroscience are that long-term memory is sustained by modifications of the connectivity of neural circuits, while short-term memory is sustained by persistent neuronal activity following the presentation of a stimulus. These two hypotheses have been substantiated by several decades of electrophysiological experiments, which report activity-dependent changes in synaptic connectivity in vitro and stimulus-selective persistent neuronal activity in delayed-response tasks in behaving monkeys. They have been implemented in attractor network models, which store specific patterns of activity using Hebbian plasticity rules and then allow retrieval of these patterns as attractors of the network dynamics. A long-standing question in the field is how many patterns, or equivalently how much information, can be stored in such networks. Here, we compute the storage capacity of networks of binary neurons and binary synapses. Synapses store information according to a simple stochastic learning process that consists of transitions between synaptic states conditioned on the states of pre- and post-synaptic neurons. We consider this learning process in two limits: a one-shot learning scenario, where each pattern is presented only once, and a slow learning scenario, where noisy versions of a set of patterns are presented multiple times, but transition probabilities are small. The two limits are assumed to represent, in a simplified way, learning in the hippocampus and neocortex, respectively. We show that in both cases the information stored per synapse remains finite in the large N limit, when the coding is sparse. Furthermore, we characterize the strong finite-size effects that exist in such networks.
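As a concrete illustration of the learning process described in the summary above, the minimal Python sketch below stores sparse binary patterns in a binary synaptic matrix through stochastic potentiation and depression, then checks whether a stored pattern is a fixed point of the retrieval dynamics. All parameter values, the specific depression condition (exactly one of the two neurons active), and the threshold heuristic are illustrative assumptions, not the optimized choices derived in the paper; with q_plus = 1, q_minus = 0, and J initialized to zero, the rule reduces to the classic Willshaw model.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Illustrative parameters (hypothetical values, not the paper's optimized ones).
N = 2000        # number of binary neurons
f = 0.05        # coding level: fraction of active neurons in each pattern
P = 50          # number of stored patterns
q_plus = 0.8    # potentiation probability
q_minus = 0.02  # depression probability

# Random sparse binary patterns: xi[mu, i] = 1 with probability f.
xi = (rng.random((P, N)) < f).astype(np.uint8)

# Binary synaptic matrix; here initialized with half the synapses
# potentiated (an arbitrary choice for this sketch).
J = (rng.random((N, N)) < 0.5).astype(np.uint8)

# One-shot stochastic learning: each pattern is presented exactly once.
# A synapse is potentiated with probability q_plus when pre- and
# post-synaptic neurons are both active, and depressed with probability
# q_minus when exactly one of the two is active (one possible choice of
# transition rule; the paper considers these rules more generally).
for mu in range(P):
    post = xi[mu][:, None]   # post-synaptic activity (rows of J)
    pre = xi[mu][None, :]    # pre-synaptic activity (columns of J)
    both_active = (post & pre).astype(bool)
    one_active = (post ^ pre).astype(bool)
    J[both_active & (rng.random((N, N)) < q_plus)] = 1
    J[one_active & (rng.random((N, N)) < q_minus)] = 0
np.fill_diagonal(J, 0)  # no self-connections

# Retrieval check: a stored pattern is a fixed point if one step of the
# dynamics s_i <- Theta(sum_j J_ij * s_j - theta) leaves it unchanged.
theta = 0.7 * f * N                 # heuristic firing threshold (assumed)
s = xi[-1].astype(np.int32)         # most recently stored pattern
field = J.astype(np.int32) @ s      # synaptic input to each neuron
s_new = (field > theta).astype(np.int32)
print("fraction of bits correct:", (s_new == s).mean())
```

In this sketch the fraction of correctly retrieved bits degrades as P grows or as the depression rate increases, which gives an informal feel for the capacity limits the paper computes analytically.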
Suggested Citation
Alexis M Dubreuil & Yali Amit & Nicolas Brunel, 2014.
"Memory Capacity of Networks with Stochastic Binary Synapses,"
PLOS Computational Biology, Public Library of Science, vol. 10(8), pages 1-15, August.
Handle:
RePEc:plo:pcbi00:1003727
DOI: 10.1371/journal.pcbi.1003727