Abstract

Spontaneous activity in neural networks usually reduces computational performance. As a consequence, artificial neural networks are often operated at the edge of chaos, where the network is stable yet highly susceptible to input information. Surprisingly, the regular spontaneous dynamics of random recurrent neural networks (rRNNs) driven beyond their resting state exhibit a high degree of spatio-temporal synchronization, a situation also found in biological neural networks. Characterizing information preservation via complexity indices, we show how spatial synchronization allows rRNNs to reduce the negative impact of regular spontaneous dynamics on their computational performance.
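The "edge of chaos" mentioned above can be illustrated with a minimal sketch (not the authors' model): a random recurrent network x[t+1] = tanh(rho · W·x[t]), where rho rescales the spectral radius of a random coupling matrix W. Below rho ≈ 1 autonomous activity decays to the fixed point (resting state); above it, spontaneous dynamics persist without any input. All names and parameters here are illustrative assumptions.

```python
import numpy as np

# Hypothetical toy model, not the paper's network: a random recurrent
# network whose coupling matrix W is normalized to unit spectral radius,
# then rescaled by rho. Around rho ~ 1 the autonomous dynamics change
# from contracting (fixed point) to sustained spontaneous activity.

rng = np.random.default_rng(0)
N = 200  # number of neurons (arbitrary choice)

W = rng.standard_normal((N, N)) / np.sqrt(N)
W /= np.max(np.abs(np.linalg.eigvals(W)))  # normalize spectral radius to 1

def free_run(rho, steps=500):
    """Run the autonomous network (no input) and return the final state norm."""
    x = 0.1 * rng.standard_normal(N)  # small random initial condition
    for _ in range(steps):
        x = np.tanh(rho * W @ x)
    return np.linalg.norm(x)

# Below the instability threshold activity dies out; above it, it persists.
print(free_run(0.8))  # decays toward the resting state (near zero)
print(free_run(1.5))  # spontaneous activity survives (order-one norm)
```

Operating near rho ≈ 1, as in reservoir computing, keeps the network stable while maximizing its sensitivity to injected input, which is the trade-off the abstract refers to.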

Details

Title
Dynamical complexity and computation in recurrent neural networks beyond their fixed point
Author
Marquez, Bicky A. 1; Larger, Laurent 1; Jacquot, Maxime 1; Chembo, Yanne K. 2; Brunner, Daniel 1

1 FEMTO-ST Institute, CNRS & Univ. Bourgogne Franche-Comté, 15B Avenue des Montboucons, Besançon Cedex, France
2 FEMTO-ST Institute, CNRS & Univ. Bourgogne Franche-Comté, 15B Avenue des Montboucons, Besançon Cedex, France; GeorgiaTech-CNRS Joint International Laboratory [UMI 2958], Atlanta Mirror Site, School of Electrical and Computer Engineering, 777 Atlantic Dr NW, Atlanta, GA, USA
Pages
1-9
Publication year
2018
Publication date
Feb 2018
Publisher
Nature Publishing Group
e-ISSN
2045-2322
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2006814121
Copyright
© 2018. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.