Abstract

In an aging society, the number of people suffering from vascular disorders is rapidly increasing and has become a social problem. The death rate due to stroke, the second leading cause of global mortality, has increased by 40% in the last two decades, and stroke can also cause paralysis. Recently, brain-computer interfaces (BCIs) have garnered attention in the rehabilitation field as assistive technology. A BCI for the motor rehabilitation of patients with paralysis promotes neural plasticity when subjects perform motor imagery (MI). Feedback, such as visual and proprioceptive feedback, influences brain rhythm modulation and thereby contributes to MI learning and motor function restoration. In addition, virtual reality (VR) provides powerful graphical options to enhance feedback visualization. This work aimed to improve an immersive VR-BCI based on hand MI by using visual-electrotactile stimulation feedback instead of visual feedback alone. The MI tasks included grasping, flexion/extension, and their random combination, and the subjects answered a system perception questionnaire after the experiments. The proposed system was evaluated with twenty able-bodied subjects. Visual-electrotactile feedback improved the mean classification accuracy for the grasping (93.00% ± 3.50%) and flexion/extension (95.00% ± 5.27%) MI tasks. Additionally, the subjects achieved an acceptable mean classification accuracy (maximum of 86.5% ± 5.80%) for the random MI task, which required more concentration. The proprioceptive feedback maintained a lower mean power spectral density in all channels and higher attention levels than visual feedback during the test trials for the grasping and flexion/extension MI tasks, and it generated greater relative power in the μ-band over the premotor cortex, indicating better MI preparation. Thus, electrotactile stimulation combined with visual feedback enhanced the immersive VR-BCI classification accuracy by 5.5% and 4.5% for the grasping and flexion/extension MI tasks, respectively, retained the subjects' attention, and facilitated MI better than visual feedback alone.
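The abstract compares feedback conditions through band-power measures derived from the EEG power spectral density, such as relative μ-band power. The snippet below is a minimal sketch, not taken from the paper, of one common way to estimate relative μ-band power for a single EEG channel using Welch's method; the sampling rate, band limits, and function name are assumptions for illustration only.

# A minimal sketch (not from the paper): estimating relative mu-band (8-13 Hz)
# power for a single EEG channel from its Welch power spectral density.
# The sampling rate, band limits, and function name are illustrative
# assumptions, not the authors' actual processing pipeline.
import numpy as np
from scipy.signal import welch

def relative_mu_power(eeg, fs=250.0, mu_band=(8.0, 13.0), total_band=(1.0, 40.0)):
    """Return mu-band power as a fraction of broadband (1-40 Hz) power."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))  # 2-second Welch windows
    mu = (freqs >= mu_band[0]) & (freqs <= mu_band[1])
    total = (freqs >= total_band[0]) & (freqs <= total_band[1])
    # Approximate band power by summing PSD bins times the frequency resolution;
    # the resolution cancels in the ratio but keeps the intermediates meaningful.
    df = freqs[1] - freqs[0]
    return (psd[mu].sum() * df) / (psd[total].sum() * df)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    synthetic_eeg = rng.standard_normal(10 * 250)  # 10 s of synthetic data at 250 Hz
    print(f"Relative mu power: {relative_mu_power(synthetic_eeg):.3f}")

Comparing this ratio between feedback conditions over premotor channels is one way to express the μ-band effect summarized above; the paper's actual signal processing and statistical analysis may differ.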

Details

Title
Visual-Electrotactile Stimulation Feedback to Improve Immersive Brain-Computer Interface Based on Hand Motor Imagery
Author
Achanccaray, David 1; Izumi, Shin-Ichi 2; Hayashibe, Mitsuhiro 1

1 Neuro-Robotics Laboratory, Department of Robotics, Tohoku University, Sendai 980-8579, Japan
2 Department of Physical Medicine and Rehabilitation, Tohoku University Graduate School of Medicine, Sendai 980-8575, Japan; Graduate School of Biomedical Engineering, Tohoku University, Sendai 980-8574, Japan
Editor
Abdelkader Nasreddine Belkacem
Publication year
2021
Publication date
2021
Publisher
John Wiley & Sons, Inc.
ISSN
1687-5265
e-ISSN
1687-5273
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2497884799
Copyright
Copyright © 2021 David Achanccaray et al. This is an open access article distributed under the Creative Commons Attribution License (the “License”), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License. https://creativecommons.org/licenses/by/4.0/