
Abstract

Recent advances in neurotechnology have enabled the simultaneous recording of large-scale neural activity and behavioral data, opening opportunities to elucidate the neural mechanisms underlying behavior and to improve brain-computer interfaces (BCIs). Establishing an association between neural and behavioral recordings plays a pivotal role in realizing these opportunities. However, both neural and behavioral recordings are inherently high-dimensional, making it challenging to establish this association directly. Alternatively, projecting these high-dimensional recordings onto structured latent spaces can be effective for identifying underlying neural-behavioral relationships and bridging the gap between the two.

Algorithms that project neural and behavioral recordings onto lower-dimensional representations have been proposed. For instance, latent variable models (LVMs) have been introduced to transform neural data into interpretable low-dimensional representations for analysis. While effective, these methods often overlook the sequential and causal properties inherent in neural activity. Meanwhile, approaches for automatic behavior understanding typically depend on extensive human annotation to achieve high precision; the annotation procedure is labor-intensive and subjective, limiting the scalability and consistency of these approaches. To overcome these limitations, this thesis proposes approaches that enable more biologically realistic and efficient analysis of neural and behavioral recordings.

Specifically, we introduce a neural representation learning approach that explicitly incorporates temporal causality: representation learning is formulated as estimating future neural activity solely from its past. We additionally include a graphical prior to model pairwise spatial interactions among neural recording channels. Experiments on synthetic and real neural datasets demonstrate that this method enhances the estimation of future neural dynamics, recovers the underlying spatial interactions, and aligns neural trajectories with behavioral states. In parallel, we develop an active learning-based, semi-supervised approach for efficient behavioral state discovery and classification, significantly reducing the annotation burden while maintaining high accuracy. We package this approach as OpenLabCluster, an open-source toolkit with a convenient user interface that identifies behavior categories across diverse species, democratizing these behavioral analysis techniques. Finally, we propose algorithms that bridge neural representations and behavioral states. In an application to brain-to-text decoding, we show improved accuracy in decoding neural signals into phonemic and textual outputs by leveraging refined behavioral states, underscoring the importance of relevant and precise behavioral states as additional guidance for enhancing brain-decoding fidelity.
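To make the first contribution concrete, the following is a minimal sketch of a temporally causal representation learner of the kind described above: a recurrent encoder that sees only past activity is trained to predict the next time step, with a sparsity-regularized channel-interaction matrix standing in for the graphical prior over pairwise spatial interactions. This is an illustrative reading of the abstract, not the dissertation's implementation; the choice of a GRU, the layer sizes, the L1 penalty, and names such as CausalNeuralPredictor are all assumptions.

# Sketch of a causal future-prediction representation learner (assumed architecture).
import torch
import torch.nn as nn

class CausalNeuralPredictor(nn.Module):
    def __init__(self, n_channels: int, latent_dim: int = 32):
        super().__init__()
        # A GRU propagates information forward in time only, so the latent
        # state at time t depends solely on past activity (temporal causality).
        self.encoder = nn.GRU(n_channels, latent_dim, batch_first=True)
        self.readout = nn.Linear(latent_dim, n_channels)
        # Learnable pairwise interaction matrix among recording channels,
        # standing in for the graphical prior described in the abstract.
        self.adjacency = nn.Parameter(torch.zeros(n_channels, n_channels))

    def forward(self, x_past: torch.Tensor) -> torch.Tensor:
        # x_past: (batch, time, channels); returns next-step predictions.
        latent, _ = self.encoder(x_past)
        mixed = x_past @ self.adjacency.softmax(dim=-1)  # channel mixing via learned interactions
        return self.readout(latent) + mixed

def training_step(model, x, optimizer, sparsity_weight=1e-3):
    # Predict activity at time t+1 from activity up to time t.
    pred = model(x[:, :-1])
    loss = nn.functional.mse_loss(pred, x[:, 1:])
    # Encourage a sparse interaction structure, mimicking a graphical prior.
    loss = loss + sparsity_weight * model.adjacency.abs().mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Example usage on synthetic data: 8 trials, 100 time bins, 64 channels.
model = CausalNeuralPredictor(n_channels=64)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
print(training_step(model, torch.randn(8, 100, 64), opt))

After training, the learned adjacency matrix can be inspected as an estimate of spatial interactions among channels, and the GRU latent states serve as the causal representation aligned with behavior in the downstream analyses described above.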

Overall, these contributions facilitate the understanding of neural-behavioral relationships and improve the precision of brain-computer interfaces, potentially paving the way for discoveries in neural computation and the development of high-performance BCIs.

Details

Title
Learning Neural Representations Compatible With Behavior Interpretation
Number of pages
161
Publication year
2025
Degree date
2025
School code
0250
Source
DAI-B 86/10(E), Dissertation Abstracts International
ISBN
9798310396869
Committee member
Orsborn, Amy; Rao, Rajesh; Iyer, Vikram
University/institution
University of Washington
Department
Electrical and Computer Engineering
University location
United States -- Washington
Degree
Ph.D.
Source type
Dissertation or Thesis
Language
English
Document type
Dissertation/Thesis
Dissertation/thesis number
31844666
ProQuest document ID
3193535061
Document URL
https://www.proquest.com/dissertations-theses/learning-neural-representations-compatible-with/docview/3193535061/se-2?accountid=208611
Copyright
Database copyright ProQuest LLC; ProQuest does not claim copyright in the individual underlying works.
Database
ProQuest One Academic