
Abstract

The way artificial neural networks are trained with backpropagation requires a degree of synchronization of operations and non-local knowledge of the network's computational graph, both of which are infeasible in noisy asynchronous circuitry (be it biological, analog electronic, or optical). Learning algorithms based on temporal or spatial differences in neural activity allow gradients to be estimated, and hence learning to take place, without these problematic requirements.

In this thesis, we explore a number of such alternative learning algorithms. Paper A presents a variation of contrastive Hebbian learning, which achieves Lipschitz-1 hidden layers by construction. Paper B focuses on efficient training on traditional digital hardware by presenting a variant of backpropagation compatible with quantized weights. Paper C returns to the topic of contrastive Hebbian learning by presenting a new local learning algorithm for training feedforward networks based on neurons possessing two internal states. These dyadic neurons perform credit assignment by encoding errors as differences and predictions as averages of the internal states. Paper D presents a new variation of dual propagation and provides derivations of both the original and the new variant. Paper E presents a general framework for dyadic learning, which encompasses dual propagation in feedforward models and equilibrium propagation (a well-known variant of contrastive Hebbian learning) on Hopfield models as special cases, while also being applicable to arbitrarily connected networks. The case of a skew-symmetric Hopfield network is found to be particularly intriguing as it, like the model from Paper A, provides Lipschitz-1 layers by construction.
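To make the dyadic-neuron idea concrete, the following is a minimal illustrative sketch (not the thesis' exact algorithm) of a two-state neuron layer in which predictions are read out as the average of the two internal states and errors as their scaled difference, so that the resulting weight updates are purely local. The network shape, the nudging factor beta, the learning rate, and all variable names are assumptions made for this example only.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4,))            # input activity
W1 = 0.1 * rng.normal(size=(3, 4))   # input -> hidden weights
W2 = 0.1 * rng.normal(size=(2, 3))   # hidden -> output weights
target = np.array([1.0, 0.0])
beta, lr = 0.5, 0.1                  # assumed nudging factor and learning rate

def f(a):
    return np.tanh(a)                # activation function

# Forward pass: the two internal states coincide before any feedback arrives.
h_plus = h_minus = f(W1 @ x)
y = W2 @ ((h_plus + h_minus) / 2)

# Top-down feedback: nudge the two hidden states in opposite directions along
# a back-projected output error (here simply W2^T times the error).
err_out = target - y
feedback = W2.T @ err_out
h_plus = f(W1 @ x + beta * feedback)
h_minus = f(W1 @ x - beta * feedback)

# Local updates: each weight change uses only the pre-synaptic average state
# and the post-synaptic state difference (or the output error).
h_avg = (h_plus + h_minus) / 2       # prediction: average of the two states
h_diff = (h_plus - h_minus) / (2 * beta)  # error estimate: scaled difference
W2 += lr * np.outer(err_out, h_avg)
W1 += lr * np.outer(h_diff, x)
```

In this sketch no global backward pass is needed: once the opposing feedback has been applied, every synapse updates from quantities available at its own pre- and post-synaptic neurons, which is the locality property the abstract refers to.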

Details

Title
Local Learning Rules for Deep Neural Networks with Two-State Neurons
Number of pages
61
Publication year
2025
Degree date
2025
School code
0419
Source
DAI-B 86/10(E), Dissertation Abstracts International
ISBN
9798311933957
University/institution
Chalmers Tekniska Hogskola (Sweden)
University location
Sweden
Degree
Ph.D.
Source type
Dissertation or Thesis
Language
English
Document type
Dissertation/Thesis
Dissertation/thesis number
31957778
ProQuest document ID
3195715935
Document URL
https://www.proquest.com/dissertations-theses/local-learning-rules-deep-neural-networks-with/docview/3195715935/se-2?accountid=208611
Copyright
Database copyright ProQuest LLC; ProQuest does not claim copyright in the individual underlying works.
Database
ProQuest One Academic