Abstract

In contemporary muscle-computer interfaces for upper limb prosthetics there is often a trade-off between control robustness and the range of executable movements. As a very low movement error rate is necessary in practical applications, this often results in severely limited controllability; a problem growing ever more salient as the mechanical sophistication of multifunctional myoelectric prostheses continues to improve. A possible remedy could come from the use of multi-label machine learning methods, where complex movements can be expressed as the superposition of several simpler movements. Here, we investigate this claim by applying a multi-label classification scheme in the form of a deep convolutional neural network (CNN) to high density surface electromyography (HD-sEMG) recordings. We use 16 independent labels to model the state of the hand and forearm, representing their major degrees of freedom. By training the neural network to detect these labels from sequences of 24 consecutive 16 × 8 sEMG images sampled at 2048 Hz, we achieved a mean exact match rate of 78.7% and a mean Hamming loss of 2.9% across 14 healthy test subjects. With this, we demonstrate the feasibility of highly versatile and responsive sEMG control interfaces without loss of accuracy.
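The abstract does not include code, so the following is only a minimal, hypothetical sketch of the kind of multi-label CNN and evaluation metrics it describes. It assumes a PyTorch-style implementation in which the 24 time samples of each window act as the channel ("depth") dimension of the 16 × 8 electrode image, with 16 independent sigmoid outputs; the framework, layer sizes, and training details are assumptions, not the authors' architecture.

```python
# Minimal sketch (not the authors' code): a 2D CNN that treats the 24 time
# samples of each 16 x 8 HD-sEMG window as input channels ("time-domain depth")
# and predicts 16 independent binary movement labels. All layer sizes are
# illustrative assumptions.
import torch
import torch.nn as nn

class MultiLabelSEMGNet(nn.Module):
    def __init__(self, time_depth=24, n_labels=16):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(time_depth, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),            # pool over the 16 x 8 electrode grid
        )
        self.classifier = nn.Linear(64, n_labels)  # one logit per label

    def forward(self, x):                       # x: (batch, 24, 16, 8)
        h = self.features(x).flatten(1)
        return self.classifier(h)               # raw logits; sigmoid applied by the loss

model = MultiLabelSEMGNet()
loss_fn = nn.BCEWithLogitsLoss()                # independent per-label sigmoid loss

# Toy batch: 4 windows of 24 samples over the 16 x 8 electrode grid
x = torch.randn(4, 24, 16, 8)
y = torch.randint(0, 2, (4, 16)).float()        # 16 binary movement labels
loss = loss_fn(model(x), y)

# The two metrics reported in the abstract:
pred = (torch.sigmoid(model(x)) > 0.5).float()
exact_match_rate = (pred == y).all(dim=1).float().mean()  # all 16 labels correct
hamming_loss = (pred != y).float().mean()                 # fraction of mispredicted labels
```

Exact match rate counts a window as correct only if all 16 labels are predicted correctly, while Hamming loss counts individual label errors, which is why the reported 2.9% Hamming loss coexists with a 78.7% exact match rate.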

Details

Title
Extraction of Multi-Labelled Movement Information from the Raw HD-sEMG Image with Time-Domain Depth
Author
Olsson, Alexander E. 1; Sager, Paulina 1; Andersson, Elin 1; Björkman, Anders 2; Malešević, Nebojša 1; Antfolk, Christian 1

1 Lund University, Department of Biomedical Engineering, Faculty of Engineering, Lund, Sweden (GRID:grid.4514.4) (ISNI:0000 0001 0930 2361)
2 Skåne University Hospital, Department of Hand Surgery, Malmö, Sweden (GRID:grid.411843.b) (ISNI:0000 0004 0623 9987)
Publication year
2019
Publication date
2019
Publisher
Nature Publishing Group
e-ISSN
2045-2322
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2222924841
Copyright
© The Author(s) 2019. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.