Memory capacity of networks with stochastic binary synapses.

Date

2014-08

Abstract

In standard attractor neural network models, specific patterns of activity are stored in the synaptic matrix, so that they become fixed point attractors of the network dynamics. The storage capacity of such networks has been quantified in two ways: the maximal number of patterns that can be stored, and the stored information measured in bits per synapse. In this paper, we compute both quantities in fully connected networks of N binary neurons with binary synapses, storing patterns with coding level f, in the large N and sparse coding limits (N → ∞, f → 0). We also derive finite-size corrections that accurately reproduce the results of simulations in networks of tens of thousands of neurons. These methods are applied to three different scenarios: (1) the classic Willshaw model, (2) networks with stochastic learning in which patterns are shown only once (one-shot learning), (3) networks with stochastic learning in which patterns are shown multiple times. The storage capacities are optimized over network parameters, which allows us to compare the performance of the different models. We show that finite-size effects strongly reduce the capacity, even for networks of realistic sizes. We discuss the implications of these results for memory storage in the hippocampus and cerebral cortex.
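
The first scenario above can be reproduced in a few lines of code. The following is a minimal NumPy sketch of the Willshaw model (an illustration, not code from the paper): binary synapses are switched on whenever two neurons are coactive in some stored pattern, and retrieval thresholds each neuron's summed input from the cue. The network size, coding level, and pattern count are arbitrary example values.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 1000        # number of binary neurons (the paper studies the large-N limit)
f = 0.02        # coding level: fraction of neurons active in each pattern
P = 500         # number of stored patterns
M = int(f * N)  # active neurons per pattern

# Draw P sparse binary patterns, each with exactly M active units.
patterns = np.zeros((P, N), dtype=int)
for mu in range(P):
    patterns[mu, rng.choice(N, size=M, replace=False)] = 1

# Willshaw learning rule: a binary synapse W[i, j] is switched on if
# neurons i and j are coactive in at least one stored pattern.
W = (patterns.T @ patterns > 0).astype(int)
np.fill_diagonal(W, 0)  # no self-connections

# Retrieval: a neuron fires if its summed input from the cue reaches
# the input an active neuron receives for a stored pattern (M - 1 here,
# since self-connections are removed).
def recall(cue, theta=M - 1):
    return (W @ cue >= theta).astype(int)

n_errors = sum(int(np.any(recall(p) != p)) for p in patterns)
print(f"{n_errors}/{P} patterns retrieved with at least one error")
```

At these example parameters the probability of a spurious activation is negligible and recall is essentially perfect; raising P drives the network toward the capacity limit the paper quantifies (for the classic Willshaw model in the sparse-coding limit, on the order of ln 2 ≈ 0.69 bits per synapse).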

Citation

Published Version (Please cite this version)

10.1371/journal.pcbi.1003727

Publication Info

Dubreuil, Alexis M., Yali Amit and Nicolas Brunel (2014). Memory capacity of networks with stochastic binary synapses. PLoS Comput Biol, 10(8), e1003727. doi:10.1371/journal.pcbi.1003727. Retrieved from https://hdl.handle.net/10161/15122.

This citation is constructed from limited available data and may be imprecise. To cite this article, please review and use the official citation provided by the journal.

Scholars@Duke

Nicolas Brunel

Duke School of Medicine Distinguished Professor in Neuroscience

We use theoretical models of brain systems to investigate how they process and learn information from their inputs. Our current work focuses on the mechanisms of learning and memory, from the synapse to the network level, in collaboration with various experimental groups. Using methods from statistical physics, we have recently shown that the synaptic connectivity of a network that maximizes storage capacity reproduces two key experimentally observed features: low connection probability and strong overrepresentation of bidirectionally connected pairs of neurons. We have also inferred 'synaptic plasticity rules' (a mathematical description of how synaptic strength depends on the activity of pre- and postsynaptic neurons) from data, and shown that networks endowed with a plasticity rule inferred from data have a storage capacity close to the optimal bound.
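
As a purely hypothetical illustration of what such a plasticity rule looks like, the sketch below implements a simple rate-based rule in which the weight change is proportional to the presynaptic rate and to the postsynaptic rate's distance from a threshold. The functional form and parameters are assumptions chosen for illustration, not the rules the group inferred from data.

```python
import numpy as np

# A plasticity rule maps (presynaptic activity, postsynaptic activity)
# to a change in synaptic strength. This form (potentiation when the
# postsynaptic rate exceeds a threshold, depression below it) is a
# common textbook choice, used here only as an illustration.
def delta_w(pre_rate, post_rate, eta=0.001, theta=10.0):
    return eta * pre_rate * (post_rate - theta)

w = 0.5  # initial synaptic strength, kept in [0, 1]
for pre, post in [(20.0, 30.0), (20.0, 5.0)]:
    w = float(np.clip(w + delta_w(pre, post), 0.0, 1.0))
    print(f"pre={pre:4.1f} Hz, post={post:4.1f} Hz -> w={w:.3f}")
```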



Unless otherwise indicated, scholarly articles published by Duke faculty members are made available here with a CC-BY-NC (Creative Commons Attribution Non-Commercial) license, as enabled by the Duke Open Access Policy. If you wish to use the materials in ways not already permitted under CC-BY-NC, please consult the copyright owner. Other materials are made available here through the author’s grant of a non-exclusive license to make their work openly accessible.