Full Text


Abstract

Background: Due to complex signal characteristics and marked individual differences, the decoding of motor imagery electroencephalogram (MI-EEG) signals is limited by the unsatisfactory performance of traditional models. Methods: A subject-independent model named MSEI-ENet is proposed for multi-task MI-EEG decoding. It employs a specially designed multi-scale EEG-inception (MSEI) module for comprehensive feature learning. The encoder module further detects discriminative information through its multi-head self-attention layer, whose larger receptive field enhances feature representation and improves recognition efficacy. Results: Experimental results on BCI Competition IV dataset 2a showed that the proposed model yielded an overall accuracy of 94.30%, an MF1 score of 94.31%, and a Kappa of 0.92. Conclusions: A performance comparison with state-of-the-art methods demonstrated the effectiveness and generalizability of the proposed model on challenging multi-task MI-EEG decoding.
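
The abstract describes two main components: a multi-scale EEG-inception (MSEI) front end with parallel temporal convolutions, and an encoder whose multi-head self-attention supplies a larger receptive field. The following is a minimal PyTorch sketch of that general idea, not the authors' implementation: the kernel sizes, branch channel counts, encoder depth, and head count are illustrative assumptions; only the input shape (22 EEG channels) and the four motor imagery classes follow from BCI Competition IV dataset 2a.

import torch
import torch.nn as nn

class MSEIBlock(nn.Module):
    """Multi-scale inception block: parallel temporal convolutions with
    different kernel lengths, concatenated along the channel axis."""
    def __init__(self, in_ch: int, branch_ch: int, kernel_sizes=(15, 31, 63)):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Sequential(
                # odd kernel with padding k // 2 keeps the time length fixed
                nn.Conv1d(in_ch, branch_ch, k, padding=k // 2),
                nn.BatchNorm1d(branch_ch),
                nn.ELU(),
            )
            for k in kernel_sizes
        )

    def forward(self, x):  # x: (batch, in_ch, time)
        return torch.cat([b(x) for b in self.branches], dim=1)

class MSEIENetSketch(nn.Module):
    """Inception front end followed by a transformer encoder; the
    multi-head self-attention attends across all time steps, which is
    the 'larger receptive field' the abstract refers to."""
    def __init__(self, eeg_ch=22, n_classes=4, branch_ch=16, n_heads=4):
        super().__init__()
        self.msei = MSEIBlock(eeg_ch, branch_ch)
        d_model = branch_ch * 3  # three inception branches concatenated
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):                # x: (batch, eeg_ch, time)
        f = self.msei(x)                 # (batch, d_model, time)
        f = f.permute(0, 2, 1)           # (batch, time, d_model) for attention
        f = self.encoder(f)
        return self.head(f.mean(dim=1))  # pool over time, then classify

# Example: a batch of 22-channel, 1000-sample trials (4 s at 250 Hz)
logits = MSEIENetSketch()(torch.randn(8, 22, 1000))
print(logits.shape)  # torch.Size([8, 4])

Concatenating the branch outputs is what makes the representation multi-scale: each branch captures temporal patterns at a different time scale, while the self-attention layers relate features across the whole trial rather than within a fixed convolutional window.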

Details

Title
MSEI-ENet: A Multi-Scale EEG-Inception Integrated Encoder Network for Motor Imagery EEG Decoding
Author
Wu, Pengcheng; Fei, Keling; Chen, Baohong; Pan, Lizheng
First page
129
Publication year
2025
Publication date
2025
Publisher
MDPI AG
e-ISSN
2076-3425
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
3170928120
Copyright
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.