© 2022. This work is licensed under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.

Abstract

In recent years, gesture recognition based on surface electromyography (sEMG) signals has been extensively studied. However, the accuracy and stability of gesture recognition achieved with traditional machine learning algorithms are still insufficient for some real-world application scenarios. To address this, this paper proposes a method combining feature selection and an Ensemble Extreme Learning Machine (EELM) to improve recognition performance based on sEMG signals. First, the input sEMG signals are preprocessed and 16 features are extracted from each channel. Next, the features that contribute most to gesture recognition are selected from the extracted features using the recursive feature elimination (RFE) algorithm. Then, several independent ELM base classifiers are established using the selected features. Finally, the recognition result is determined by integrating the outputs of the ELM base classifiers through majority voting. The Ninapro DB5 dataset, containing 52 different hand movements captured from ten able-bodied subjects, was used to evaluate the performance of the proposed method. The results showed that the proposed method performed best (overall average accuracy 80.8%) compared with decision tree (DT), ELM, and random forest (RF) methods.
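The classification stage described in the abstract — several independently trained ELM base classifiers combined by majority voting — can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the `ELM` class, hidden-layer size, and the synthetic 16-dimensional "feature" data are all hypothetical stand-ins; a standard single-hidden-layer ELM (random fixed input weights, output weights solved by least squares) is assumed.

```python
import numpy as np

class ELM:
    """Minimal single-hidden-layer Extreme Learning Machine classifier (a sketch)."""
    def __init__(self, n_hidden=40, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def _hidden(self, X):
        # Hidden-layer activations with fixed random weights.
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, y):
        n_features = X.shape[1]
        self.classes_ = np.unique(y)
        # Input weights and biases are drawn at random and never trained.
        self.W = self.rng.normal(size=(n_features, self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        # One-hot encode the class labels as regression targets.
        T = (y[:, None] == self.classes_[None, :]).astype(float)
        H = self._hidden(X)
        # Output weights: least-squares solution via the Moore-Penrose pseudoinverse.
        self.beta = np.linalg.pinv(H) @ T
        return self

    def predict(self, X):
        scores = self._hidden(X) @ self.beta
        return self.classes_[np.argmax(scores, axis=1)]

def eelm_predict(models, X):
    """Combine independent ELM base classifiers by majority voting."""
    votes = np.stack([m.predict(X) for m in models])  # shape: (n_models, n_samples)
    # For each sample, pick the label predicted by the most base classifiers.
    return np.array([np.bincount(col).argmax() for col in votes.T])

# Toy demonstration on synthetic 16-dimensional "feature vectors"
# (two well-separated clusters standing in for selected sEMG features).
rng = np.random.default_rng(42)
X = np.vstack([rng.normal(loc=-2.0, size=(50, 16)),
               rng.normal(loc=2.0, size=(50, 16))])
y = np.array([0] * 50 + [1] * 50)

# Each base classifier differs only by its random hidden-layer weights.
models = [ELM(n_hidden=40, seed=s).fit(X, y) for s in range(5)]
pred = eelm_predict(models, X)
print("training accuracy:", (pred == y).mean())
```

The feature-selection step preceding this stage could, for example, use `sklearn.feature_selection.RFE` with a linear estimator to rank and prune the 16 per-channel features; the voting combiner above then operates only on the retained features.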

Details

Title
Gesture Recognition by Ensemble Extreme Learning Machine Based on Surface Electromyography Signals
Author
Peng, Fulai; Chen, Cai; Lv, Danyang; Zhang, Ningling; Wang, Xingwei; Zhang, Xikun; Wang, Zhiyong
Section
ORIGINAL RESEARCH article
Publication year
2022
Publication date
Jun 16, 2022
Publisher
Frontiers Research Foundation
e-ISSN
1662-5161
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2677211710
Copyright
© 2022. Licensed under CC BY 4.0 (see license notice above).