This paper presents a comprehensive mechanistic model of a neuron with plasticity that explains how information input as time-varying signals is processed and stored. The model also addresses two long-standing biological challenges: integrating Hebbian and homeostatic plasticity, and identifying a concise synaptic learning rule. A biologically accurate electric-circuit equivalent is derived through a one-to-one mapping from the known properties of ion channels. The often-overlooked dynamics of the synaptic cleft are essential in this derivation. Analysis of the model reveals a simple and succinct learning rule, indicating that the neuron functions as an internal-feedback adaptive filter, a common concept in signal processing. Simulations confirm the model's functionality, stability, and convergence, demonstrating that even a single neuron without external feedback can act as a potent signal processor. The model replicates several key characteristics of biological neurons that are seldom captured by other neuron models: it can encode time-varying functions, learn without risking instability, and bootstrap from a state where all synaptic weights are zero. This paper explores the function of neurons with a focus on biological accuracy, not computational efficiency. Unlike neuromorphic models, it does not aim to design devices. The electronic-circuit analogy aids understanding by leveraging decades of electronics expertise, but it is not intended for physical implementation. This interdisciplinary work spans a broad range of subjects within neurobiophysics, including neurobiology, electronics, and signal processing.
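The abstract's learning rule is not reproduced here, so the following is only a generic illustration of the adaptive-filter concept it invokes: a least-mean-squares (LMS) filter whose weights adapt from an internal error signal. The zero-initialized weights mirror the abstract's claim that the model can bootstrap from a state where all synaptic weights are zero; the signal, tap count, and step size are arbitrary choices for the sketch, not values from the paper.

```python
import numpy as np

def lms_filter(x, d, n_taps=4, mu=0.05):
    """Adapt weights online so the filter output y tracks the target d.

    The error e = d - y is fed back internally to drive the weight
    update, the defining loop of an internal-feedback adaptive filter.
    """
    w = np.zeros(n_taps)                  # bootstrap: all weights start at zero
    y = np.zeros(len(x))
    for n in range(n_taps, len(x)):
        window = x[n - n_taps:n][::-1]    # most recent samples first
        y[n] = w @ window                 # filter output
        e = d[n] - y[n]                   # internal feedback: error signal
        w += mu * e * window              # LMS weight update
    return y, w

# Example: learn to predict a noisy sinusoid one step ahead.
rng = np.random.default_rng(0)
t = np.arange(2000)
x = np.sin(0.05 * t) + 0.05 * rng.standard_normal(len(t))
d = np.roll(x, -1)                        # target: the next sample
y, w = lms_filter(x, d)
late_mse = np.mean((d[-500:-1] - y[-500:-1]) ** 2)
```

After convergence, the late-sample prediction error falls near the injected noise floor, illustrating the stability-and-convergence behavior the simulations in the paper are said to confirm, though with a textbook LMS rule rather than the paper's own.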
Competing Interest Statement
The authors have declared no competing interest.
Footnotes
* Added emphasis that this is not a neuromorphic paper.