Abstract

In this study, a channel and feature selection methodology is devised for brain-computer interface (BCI) applications using functional near-infrared spectroscopy (fNIRS). A graph convolutional network (GCN) is employed to select the appropriate and correlated fNIRS channels. In the feature extraction phase, the performance of two filter-based feature selection algorithms, (i) minimum redundancy maximum relevance (mRMR) and (ii) ReliefF, is investigated. The five most commonly used temporal statistical features (i.e., mean, slope, maximum, skewness, and kurtosis) are extracted, and a conventional support vector machine (SVM) is used as the classifier for training and testing. The proposed methodology is validated on a publicly available online dataset of motor imagery (left- and right-hand), mental arithmetic, and baseline tasks. First, the efficacy of the proposed methodology is shown for two-class BCI applications (i.e., left- vs. right-hand motor imagery and mental arithmetic vs. baseline). Second, the framework is applied to the four-class BCI problem (i.e., left-hand motor imagery vs. right-hand motor imagery vs. mental arithmetic vs. baseline). The results show that the number of selected channels and features is substantially reduced, with a significant increase in classification accuracy for both the two-class and four-class BCI applications. Both mRMR (87.8% for motor imagery, 87.1% for mental arithmetic, and 78.7% for the four-class problem) and ReliefF (90.7% for motor imagery, 93.7% for mental arithmetic, and 81.6% for the four-class problem) yielded high average classification accuracies (p<0.05). However, the results of the ReliefF algorithm were more stable and significant.
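The abstract describes a pipeline of temporal feature extraction, filter-based feature selection, and SVM classification. Below is a minimal sketch of that pipeline, assuming NumPy, SciPy, and scikit-learn; the GCN channel-selection step and the mRMR/ReliefF selectors are not reproduced here, and SelectKBest with an ANOVA F-score stands in for the filter-based selection step purely for illustration. The data shapes and the names signals and labels are hypothetical assumptions, not the authors' implementation.

# Sketch of the temporal-feature + filter-selection + SVM pipeline (illustrative only).
import numpy as np
from scipy.stats import skew, kurtosis
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def temporal_features(signals):
    """signals: (n_trials, n_channels, n_samples) fNIRS time courses.
    Returns the five temporal statistics (mean, slope, maximum, skewness,
    kurtosis) per channel, flattened to (n_trials, 5 * n_channels)."""
    t = np.arange(signals.shape[-1])
    mean_ = signals.mean(axis=-1)
    # Slope: least-squares linear fit of each time course against time.
    slope = np.polyfit(t, signals.reshape(-1, signals.shape[-1]).T, 1)[0]
    slope = slope.reshape(signals.shape[:2])
    max_ = signals.max(axis=-1)
    skew_ = skew(signals, axis=-1)
    kurt_ = kurtosis(signals, axis=-1)
    return np.concatenate([mean_, slope, max_, skew_, kurt_], axis=1)

# Hypothetical data: 60 trials, 36 channels, 10 s windows sampled at 10 Hz.
rng = np.random.default_rng(0)
signals = rng.standard_normal((60, 36, 100))
labels = rng.integers(0, 2, size=60)  # two-class case, e.g., left- vs. right-hand MI

X = temporal_features(signals)
clf = make_pipeline(StandardScaler(),
                    SelectKBest(f_classif, k=20),  # filter-based feature selection (stand-in)
                    SVC(kernel="linear"))
print(cross_val_score(clf, X, labels, cv=5).mean())

In the paper's setting, the stand-in selector would be replaced by mRMR or ReliefF scores computed on the extracted features, and channel selection by the GCN would precede feature extraction.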

Details

Title
A Hybrid GCN and Filter-Based Framework for Channel and Feature Selection: An fNIRS-BCI Study
Author
Amad Zafar 1; Karam Dad Kallu 2; M. Atif Yaqub 3; Muhammad Umair Ali 4; Jong Hyuk Byun 5; Min Yoon 6; Kwang Su Kim 7

1 Department of Intelligent Mechatronics Engineering, Sejong University, Seoul 05006, Republic of Korea
2 Department of Robotics & Artificial Intelligence (R&AI), School of Mechanical and Manufacturing Engineering (SMME), National University of Sciences and Technology (NUST), H-12, Islamabad 44000, Pakistan
3 ICFO-Institut de Ciències Fotòniques, The Barcelona Institute of Science and Technology, 08860 Castelldefels, Barcelona, Spain
4 Department of Unmanned Vehicle Engineering, Sejong University, Seoul 05006, Republic of Korea
5 Department of Mathematics, College of Natural Sciences, Pusan National University, Busan 46241, Republic of Korea
6 Department of Applied Mathematics, Pukyong National University, Busan, Republic of Korea
7 Department of Scientific Computing, Pukyong National University, Busan, Republic of Korea; Interdisciplinary Biology Laboratory (iBLab), Division of Biological Science, Graduate School of Science, Nagoya University, Nagoya, Japan
Editor
Vittorio Memmolo
Publication year
2023
Publication date
2023
Publisher
John Wiley & Sons, Inc.
ISSN
0884-8173
e-ISSN
1098-111X
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2802484141
Copyright
Copyright © 2023 Amad Zafar et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. https://creativecommons.org/licenses/by/4.0/