
Abstract

The choice of time window strongly affects the effectiveness of piecewise feature extraction procedures. We present an enhanced bag-of-patterns representation that captures the higher-level structures of brain dynamics within a wide window range. To this end, we introduce augmented instance representations with extended window lengths for the short-time Common Spatial Pattern algorithm. Within a multiple-instance learning framework, the relevant bags of patterns are selected by a sparse regression to feed a bag classifier. The proposed higher-level structure representation offers two contributions: (i) improved accuracy on bi-conditional tasks, and (ii) a better understanding of dynamic brain behavior through the learned sparse regression fits. Using a support vector machine classifier, the performance achieved on a public motor imagery dataset (left-hand and right-hand tasks) shows that the proposed framework yields very competitive results, providing robustness to the time variation of electroencephalography recordings and favoring class separability.
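
The pipeline outlined in the abstract can be illustrated with a minimal, self-contained sketch. The code below is an assumption-laden approximation, not the authors' implementation: it uses synthetic EEG epochs, a basic two-class Common Spatial Pattern (CSP) computed per assumed window length, log-variance features pooled into a bag of instances per trial, a Lasso standing in for the sparse regression step that selects relevant instance features, and a linear support vector machine as the bag classifier. Window lengths, regularization strength, and all variable names are illustrative assumptions.

    # Minimal sketch of the described pipeline on synthetic EEG epochs of shape
    # (trials, channels, samples). All parameters below are illustrative.
    import numpy as np
    from scipy.linalg import eigh
    from sklearn.linear_model import Lasso
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    n_trials, n_channels, n_samples = 60, 22, 500
    X = rng.standard_normal((n_trials, n_channels, n_samples))
    y = rng.integers(0, 2, n_trials)  # left-hand vs. right-hand labels

    def csp_filters(X, y, n_comp=4):
        """Fit CSP spatial filters from the two class-average covariances."""
        covs = [np.mean([np.cov(t) for t in X[y == c]], axis=0) for c in (0, 1)]
        # Generalized eigendecomposition of the class covariances
        evals, evecs = eigh(covs[0], covs[0] + covs[1])
        order = np.argsort(np.abs(evals - 0.5))[::-1]  # most discriminative first
        return evecs[:, order[:n_comp]]

    def csp_features(trial, W):
        """Log-variance of the spatially filtered window."""
        Z = W.T @ trial
        var = Z.var(axis=1)
        return np.log(var / var.sum())

    # Bag of patterns: each trial becomes a bag of instances, one instance per
    # (window length, window position) pair of the short-time CSP.
    window_lengths = [125, 250, 375]  # assumed window sizes in samples

    def make_bag(trial, filters_per_window):
        instances = []
        for w, W in zip(window_lengths, filters_per_window):
            for start in range(0, n_samples - w + 1, w // 2):  # 50% overlap
                instances.append(csp_features(trial[:, start:start + w], W))
        return np.concatenate(instances)

    # Simplification: fit one set of CSP filters per window length on full trials.
    filters_per_window = [csp_filters(X, y) for _ in window_lengths]
    bags = np.array([make_bag(t, filters_per_window) for t in X])

    # Sparse regression selects the relevant instance features of each bag ...
    lasso = Lasso(alpha=0.05).fit(bags, y)
    selected = np.flatnonzero(lasso.coef_)
    if selected.size == 0:  # fall back to all features if Lasso drops everything
        selected = np.arange(bags.shape[1])

    # ... and the reduced bag representation feeds an SVM bag classifier.
    clf = SVC(kernel="linear").fit(bags[:, selected], y)
    print("training accuracy:", clf.score(bags[:, selected], y))

On real recordings, the CSP filters would be fit per window (and within a cross-validation fold) rather than on the full trials, and the Lasso coefficients themselves can be inspected to see which window lengths and positions the sparse regression deems relevant.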

Details

Title
Enhanced Multiple Instance Representation Using Time-Frequency Atoms in Motor Imagery Classification
Author
Collazos-Huertas, Diego; Caicedo-Acosta, Julian; Castaño-Duque, German A; Acosta-Medina, Carlos D
Section
Original Research Article
Publication year
2020
Publication date
Feb 25, 2020
Publisher
Frontiers Research Foundation
ISSN
1662-4548
e-ISSN
1662-453X
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2365535724
Copyright
© 2020. This work is licensed under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.