Abstract

Memories are stored and retained through complex, coupled processes operating on multiple timescales. To understand the computational principles behind these intricate networks of interactions, we construct a broad class of synaptic models that efficiently harness biological complexity to preserve numerous memories by protecting them against the adverse effects of overwriting. The memory capacity scales almost linearly with the number of synapses, which is a substantial improvement over the square root scaling of previous models. This was achieved by combining multiple dynamical processes that initially store memories in fast variables and then progressively transfer them to slower variables. Notably, the interactions between fast and slow variables are bidirectional. The proposed models are robust to parameter perturbations and can explain several properties of biological memory, including delayed expression of synaptic modifications, metaplasticity, and spacing effects.
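The abstract describes a chain of coupled variables spanning fast to slow timescales, with bidirectional interactions that progressively transfer memory traces from fast to slow components. The following is a minimal illustrative sketch of that idea, not the authors' published equations: the number of variables, the geometric coupling factor, the Euler step size, and the leak on the slowest variable are all assumptions made for demonstration.

```python
import numpy as np

# Illustrative sketch (assumed parameters, not the paper's exact model):
# one synapse is represented by a chain of variables u[0..m-1] with
# geometrically decreasing coupling strengths. A plasticity event perturbs
# the fastest variable u[0]; bidirectional nearest-neighbor coupling then
# spreads the change toward slower variables, which change (and forget)
# far more slowly and thereby protect older memories from overwriting.

m = 8                        # number of chain variables per synapse (assumed)
g = 0.25 ** np.arange(m)     # couplings shrink geometrically along the chain (assumed factor)
dt = 0.1                     # Euler integration step (assumed)

def step(u):
    """One Euler step of the bidirectional, diffusion-like chain dynamics."""
    du = np.zeros_like(u)
    for k in range(m - 1):
        flow = g[k] * (u[k] - u[k + 1])   # exchange between neighboring variables
        du[k]     -= flow
        du[k + 1] += flow
    du[-1] -= g[-1] * u[-1]               # weak leak on the slowest variable (slow forgetting)
    return u + dt * du

# Store one memory: a potentiation event increments the fastest variable,
# then the chain relaxes and the trace diffuses into the slower variables.
u = np.zeros(m)
u[0] += 1.0
for _ in range(2000):
    u = step(u)
print(np.round(u, 4))
```

Because the couplings fall off geometrically, the slower variables retain a residue of the event long after the fast variable has equilibrated, which is the intuition behind the near-linear (rather than square-root) scaling of capacity claimed in the abstract.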

Details

Title
Computational principles of synaptic memory consolidation
Author
Benna, Marcus K; Fusi, Stefano
Pages
1697-1706
Publication year
2016
Publication date
Dec 2016
Publisher
Nature Publishing Group
ISSN
1097-6256
e-ISSN
1546-1726
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
1855418478
Copyright
Copyright Nature Publishing Group Dec 2016