Abstract

The “Completely Automated Public Turing test to Tell Computers and Humans Apart” (CAPTCHA) is a standard security protocol, widely used to distinguish between humans and malicious computer programs, known as bots. This paper presents a gesture-based CAPTCHA system, GESTCHA, that utilizes angular velocity data from the embedded gyroscope sensors of handheld touch-sensitive devices. Angular velocity data collected within a short time window is processed as gesture input. Several discriminatory feature sets were extracted from stable gyroscope readings by applying a newly proposed gesture feature extraction algorithm. We analyzed the performance of two machine learning algorithms, Naive Bayes and Random Forest, on the training gesture patterns, which led to a robust gesture recognition model. Using this model, the final prototype of GESTCHA was developed. Based on findings from a comparative usability study with 850 participants, GESTCHA shows significant improvement in solving rate and solving time compared to Google’s reCAPTCHA v3 (NoCAPTCHA).
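The abstract outlines a pipeline of gyroscope windowing, feature extraction, and classification with Naive Bayes and Random Forest. The minimal Python sketch below illustrates that general pipeline only; the statistical features, the synthetic gesture data, and all parameter choices are assumptions for illustration and are not the paper's actual gesture feature extraction algorithm or dataset.

# Illustrative sketch only: features, synthetic data, and parameters are assumptions,
# not the GESTCHA feature extraction algorithm described in the paper.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score


def extract_features(window):
    """Reduce one gyroscope window (n_samples x 3 axes, rad/s) to a feature vector."""
    feats = []
    for axis in range(window.shape[1]):
        w = window[:, axis]
        # Simple per-axis statistics: mean, spread, range, and signal energy.
        feats.extend([w.mean(), w.std(), w.min(), w.max(), np.sum(w ** 2)])
    return np.array(feats)


# Synthetic stand-in for recorded gesture windows: 2 gesture classes,
# 200 windows each, 50 angular-velocity samples per window.
rng = np.random.default_rng(0)
windows, labels = [], []
for label, bias in enumerate([0.5, -0.5]):
    for _ in range(200):
        windows.append(bias + 0.3 * rng.standard_normal((50, 3)))
        labels.append(label)

X = np.array([extract_features(w) for w in windows])
y = np.array(labels)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Compare the two classifier families named in the abstract.
for clf in (GaussianNB(), RandomForestClassifier(n_estimators=100, random_state=0)):
    clf.fit(X_train, y_train)
    print(type(clf).__name__, accuracy_score(y_test, clf.predict(X_test)))

In a real deployment the windows would come from the device's gyroscope API during the gesture challenge, and the trained model would decide whether the motion pattern matches a human gesture.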

Details

Title
GESTCHA: a gesture-based CAPTCHA design for smart devices using angular velocity
Author
Pritom, Ahmed Iqbal (1); Al Mashuk, Md. Abdullah (2); Ahmed, Somi (2); Monira, Nazifa (2); Islam, Md. Zahidul (2)

1 Islamic University of Technology, Department of Computer Science and Engineering, Gazipur, Bangladesh (GRID:grid.443073.7) (ISNI:0000 0001 0582 2044); Green University of Bangladesh, Department of Computer Science and Engineering, Dhaka, Bangladesh (GRID:grid.443003.0) (ISNI:0000 0004 0582 9395)
2 Green University of Bangladesh, Department of Computer Science and Engineering, Dhaka, Bangladesh (GRID:grid.443003.0) (ISNI:0000 0004 0582 9395)
Pages
521-549
Publication year
2023
Publication date
Jan 2023
Publisher
Springer Nature B.V.
ISSN
1380-7501
e-ISSN
1573-7721
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2758756062
Copyright
© The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2022.