© 2022. This work is licensed under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.

Abstract

As a key component in the development of affective brain-computer interfaces, emotion recognition based on electroencephalography (EEG) faces a difficult challenge: the distribution of EEG data varies across subjects and across time periods. Domain adaptation methods can effectively alleviate the generalization problem of EEG emotion recognition models. However, most of them treat multiple source domains with significantly different distributions as a single source domain, and adapt only the cross-domain marginal distribution while ignoring the joint distribution difference between domains. To exploit the advantages of multiple source distributions and better match the distributions of the source and target domains, this paper proposes a novel multi-source joint domain adaptation (MSJDA) network. We first map all domains to a shared feature space, and then, for each pair of source and target domains, align the joint distributions of the further-extracted private representations and the corresponding classification predictions. Extensive cross-subject and cross-session experiments on the benchmark SEED dataset demonstrate the effectiveness of the proposed model, which achieves notably better classification results on the more difficult cross-subject emotion recognition task.
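The pairwise alignment idea described in the abstract — measuring a discrepancy between each source domain and the target on the joint (features, predictions) representation, rather than pooling all sources into one — can be illustrated with a minimal sketch. This is not the authors' MSJDA implementation: the function names are hypothetical, and a linear-kernel MMD (squared distance between empirical means) stands in for whatever alignment criterion the paper actually uses.

```python
import numpy as np

def linear_mmd(x, y):
    # Simplified discrepancy: squared distance between empirical means
    # (the linear-kernel special case of maximum mean discrepancy).
    return float(np.sum((x.mean(axis=0) - y.mean(axis=0)) ** 2))

def joint_alignment_loss(source_domains, target_feats, target_preds):
    """Average pairwise discrepancy between each source domain and the
    target, computed on the concatenated (features, predictions) joint
    representation rather than on features alone."""
    t_joint = np.concatenate([target_feats, target_preds], axis=1)
    losses = []
    for s_feats, s_preds in source_domains:
        s_joint = np.concatenate([s_feats, s_preds], axis=1)
        losses.append(linear_mmd(s_joint, t_joint))
    return sum(losses) / len(losses)

rng = np.random.default_rng(0)
# Three hypothetical source subjects and one target subject:
# 50 trials each, 8-dim private features, 3-class prediction scores.
sources = [(rng.normal(size=(50, 8)), rng.normal(size=(50, 3)))
           for _ in range(3)]
target_feats = rng.normal(loc=0.5, size=(50, 8))
target_preds = rng.normal(size=(50, 3))

loss = joint_alignment_loss(sources, target_feats, target_preds)
print(loss)
```

In a trainable model this loss term would be minimized jointly with the classification loss, pulling each source subject's joint distribution toward the target's instead of averaging all subjects into one coarse source distribution.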

Details

Title
Multi-source joint domain adaptation for cross-subject and cross-session emotion recognition from electroencephalography
Author
Liang, Shengjin; Su, Lei; Fu, Yunfa; Wu, Liping
Section
ORIGINAL RESEARCH article
Publication year
2022
Publication date
Sep 15, 2022
Publisher
Frontiers Research Foundation
e-ISSN
1662-5161
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2714784312