Abstract

As robots become more pervasive in daily life, natural human-robot interaction (HRI) plays an increasingly important role in the development of robotics. Interest has therefore grown in vision-based hand gesture recognition for HRI, which helps bridge the communication barrier between humans and robots, with the aim of making interaction with robots as natural as interaction between people. Hand gestures provide natural, intuitive, and expressive means of communicating with robots, making their incorporation into HRI a significant research area. This paper analyzes hand gesture recognition using both monocular and RGB-D cameras for this purpose. Specifically, it discusses the main stages of visual gesture recognition: data acquisition, hand gesture detection and segmentation, feature extraction, and gesture classification. Experimental evaluations are also reviewed, and algorithms for hand gesture recognition in human-robot interaction are examined. In addition, the paper discusses the advances required to improve current hand gesture recognition systems for effective and efficient human-robot interaction.

Details

Title
Computer vision-based hand gesture recognition for human-robot interaction: a review
Author
Qi, Jing 1; Ma, Li 1; Cui, Zhenchao 1; Yu, Yushu 2

1 Hebei University, School of Cyber Security and Computer, Baoding, China (GRID:grid.256885.4) (ISNI:0000 0004 1791 4722); Hebei University, Machine Vision Engineering Research Center of Hebei Province, Baoding, China (GRID:grid.256885.4) (ISNI:0000 0004 1791 4722)
2 Beijing Institute of Technology, School of Mechatronical Engineering, Beijing, China (GRID:grid.43555.32) (ISNI:0000 0000 8841 6246)
Pages
1581-1606
Publication year
2024
Publication date
Feb 2024
Publisher
Springer Nature B.V.
ISSN
2199-4536
e-ISSN
2198-6053
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2924576399
Copyright
© The Author(s) 2023. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.