
© 2022. This work is licensed under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.

Abstract

This work presents a new approach for a hybrid sEMG hand prosthesis, based on a 3D printed model with a fully embedded computer vision system. A modified 5-layer Smaller Visual Geometry Group (VGG) convolutional neural network, running on a Raspberry Pi 3 microcomputer connected to a webcam, recognizes the shape of daily-use objects and selects the prosthetic grasp/gesture pattern among five classes: Palmar Neutral, Palmar Pronated, Tripod Pinch, Key Grasp, and Index Finger Extension. Using the Myoware board and a finite state machine, the user’s intention, conveyed by a myoelectric signal, starts the process: the system photographs the object, classifies the grasp/gesture, and commands the prosthetic motors to execute the movement. Keras was used as the application programming interface and TensorFlow as the numerical computing backend. The proposed system achieved 99% accuracy, 97% sensitivity, and 99% specificity, showing that computer vision is a promising technology for assisting grasp-pattern selection in prosthetic devices.
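The control flow described above — an EMG trigger that starts a capture, classification, and actuation sequence — can be sketched as a small finite state machine. This is a minimal illustration, not the authors' implementation: the state names, the EMG threshold, and the `capture_fn`/`classify_fn`/`actuate_fn` callbacks are all assumptions standing in for the camera, the CNN, and the motor drivers.

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()      # wait for a myoelectric trigger
    CAPTURE = auto()   # photograph the object
    CLASSIFY = auto()  # run the CNN on the image
    EXECUTE = auto()   # command the prosthetic motors

GRASPS = ["Palmar Neutral", "Palmar Pronated", "Tripod Pinch",
          "Key Grasp", "Index Finger Extension"]

class ProsthesisFSM:
    def __init__(self, capture_fn, classify_fn, actuate_fn, emg_threshold=0.5):
        # capture_fn() -> image, classify_fn(image) -> class index 0..4,
        # actuate_fn(grasp_name) -> None; hypothetical stand-ins for hardware.
        self.capture = capture_fn
        self.classify = classify_fn
        self.actuate = actuate_fn
        self.threshold = emg_threshold
        self.state = State.IDLE

    def step(self, emg_level):
        """Advance one state per call; return the grasp name once executed."""
        if self.state is State.IDLE:
            if emg_level >= self.threshold:  # user intention detected
                self.state = State.CAPTURE
        elif self.state is State.CAPTURE:
            self.image = self.capture()
            self.state = State.CLASSIFY
        elif self.state is State.CLASSIFY:
            self.grasp = GRASPS[self.classify(self.image)]
            self.state = State.EXECUTE
        elif self.state is State.EXECUTE:
            self.actuate(self.grasp)
            self.state = State.IDLE
            return self.grasp
        return None

if __name__ == "__main__":
    # Stubbed run: classifier always returns class 2 (Tripod Pinch).
    fsm = ProsthesisFSM(capture_fn=lambda: "frame",
                        classify_fn=lambda img: 2,
                        actuate_fn=lambda grasp: None)
    for emg in (0.8, 0.0, 0.0, 0.0):
        executed = fsm.step(emg)
    print(executed)  # Tripod Pinch
```

Driving the machine one transition per call keeps the EMG sampling loop responsive; a real controller would poll the Myoware output at a fixed rate and call `step` with each new sample.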

Details

Title
A Hybrid 3D Printed Hand Prosthesis Prototype Based on sEMG and a Fully Embedded Computer Vision System
Author
Castro, Maria Claudia F; Pinheiro, Wellington C; Rigolin, Glauco
Section
ORIGINAL RESEARCH article
Publication year
2022
Publication date
Jan 24, 2022
Publisher
Frontiers Research Foundation
e-ISSN
1662-5218
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2622376695