Abstract

Motor imagery electroencephalography (EEG) analysis is crucial for the development of effective brain-computer interfaces (BCIs), yet it presents considerable challenges due to the complexity of the data and inter-subject variability. This paper introduces EEGCCT, an application of compact convolutional transformers designed specifically to improve the analysis of motor imagery tasks in EEG. Unlike traditional approaches, the EEGCCT model significantly enhances generalization from limited data, addressing a common limitation of EEG datasets. We validate and test our models on the open-source BCI Competition IV datasets 2a and 2b, employing a Leave-One-Subject-Out (LOSO) strategy to ensure subject-independent evaluation. Our findings demonstrate that EEGCCT not only outperforms conventional models such as EEGNet in standard evaluations but also surpasses other advanced models such as Conformer, Hybrid s-CViT, and Hybrid t-CViT, achieving an accuracy of 70.12% with fewer parameters. Additionally, the paper presents a comprehensive ablation study covering targeted data augmentation, hyperparameter optimization, and architectural improvements.
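The LOSO protocol mentioned in the abstract can be sketched as follows. This is a minimal illustration only: the function name and the toy subject labels are our own, not from the paper (BCI Competition IV 2a actually comprises 9 subjects, so a full evaluation would run 9 folds).

```python
# Illustrative Leave-One-Subject-Out (LOSO) splitting: each fold holds out
# every trial from one subject for testing and trains on the rest.
import numpy as np

def loso_splits(subject_ids):
    """Yield (train_idx, test_idx) index pairs, one fold per unique subject."""
    subject_ids = np.asarray(subject_ids)
    for held_out in np.unique(subject_ids):
        test_mask = subject_ids == held_out
        yield np.flatnonzero(~test_mask), np.flatnonzero(test_mask)

# Toy example: 6 trials recorded from 3 subjects.
subjects = [1, 1, 2, 2, 3, 3]
for train_idx, test_idx in loso_splits(subjects):
    print(train_idx, test_idx)
```

Because the held-out subject contributes no trials to training, accuracy averaged over folds estimates performance on an entirely unseen subject, which is the subject-independent setting the paper targets.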

Details

Title
Compact convolutional transformer for subject-independent motor imagery EEG-based BCIs
Author
Keutayeva, Aigerim¹; Fakhrutdinov, Nail²; Abibullaev, Berdakh³

¹ Nazarbayev University, Institute of Smart Systems and Artificial Intelligence (ISSAI), Astana, Kazakhstan (GRID:grid.428191.7) (ISNI:0000 0004 0495 7803)
² Nazarbayev University, Department of Computer Science, Astana, Kazakhstan (GRID:grid.428191.7) (ISNI:0000 0004 0495 7803)
³ Nazarbayev University, Department of Robotics Engineering, Astana, Kazakhstan (GRID:grid.428191.7) (ISNI:0000 0004 0495 7803)
Pages
25775
Publication year
2024
Publication date
2024
Publisher
Nature Publishing Group
e-ISSN
2045-2322
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
3121470142
Copyright
© The Author(s) 2024. This work is published under http://creativecommons.org/licenses/by-nc-nd/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.