
Copyright © 2022 Guobin Wan et al. This is an open access article distributed under the Creative Commons Attribution License (the “License”), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License. https://creativecommons.org/licenses/by/4.0/

Abstract

In recent years, traditional training methods such as card teaching, DVDs, assistive technologies (e.g., augmented reality/virtual reality games and smartphone apps), human-computer interaction, and human-robot interaction have been widely applied in rehabilitation training for autism. In this article, we propose a novel framework for human-computer/robot interaction and present a preliminary intervention study for improving emotion recognition in Chinese children with autism spectrum disorder (ASD). The core of the framework is the Facial Emotion Cognition and Training System (FECTS), based on Simon Baron-Cohen's empathizing-systemizing (E-S) theory, which comprises six tasks that train children with ASD to match, infer, and imitate the facial expressions of happiness, sadness, fear, and anger. The system may be implemented on PCs, smartphones, tablets, and robots. The training records (e.g., tracked records of emotion imitation) of the Chinese autistic children interacting with a device running FECTS are uploaded to and stored in the database of a cloud-based evaluation system, through which therapists and parents can access analyses of the children's emotion-learning progress. Deep-learning algorithms for facial expression recognition and attention analysis are deployed on the back end (e.g., a PC, a robotic system, or a cloud system) implementing FECTS, enabling real-time tracking of the imitation quality and attention of the children during the expression-imitation phase. In this preliminary clinical study, a total of 10 Chinese autistic children aged 3–8 were recruited, each receiving a single 20-minute training session every day for four consecutive days. Our preliminary results validate the feasibility of the developed FECTS and the effectiveness of our algorithms for Chinese children with ASD.
To verify that FECTS can be adapted to children from other countries, children with different cultural, sociological, and linguistic contexts should be recruited in future studies.
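The abstract describes a back end that classifies a child's facial expression into one of four emotions (happiness, sadness, fear, anger) in real time. The paper's actual model is not reproduced in this record; the following is only an illustrative sketch of the final classification step (a linear layer followed by softmax over the four emotion classes), with toy weights and feature dimensions that are assumptions, not the authors' implementation.

```python
import math

# The four target emotions named in the abstract.
EMOTIONS = ["happiness", "sadness", "fear", "anger"]

def softmax(scores):
    """Convert raw class scores into a probability distribution."""
    m = max(scores)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def classify_expression(features, weights, bias):
    """Classify a face feature vector into one of the four emotions.

    features: list of floats (hypothetical face embedding)
    weights:  4 rows, one per emotion, each len(features) long
    bias:     4 floats
    Returns (predicted_label, probabilities).
    """
    scores = [sum(w * f for w, f in zip(row, features)) + b
              for row, b in zip(weights, bias)]
    probs = softmax(scores)
    best = max(range(len(probs)), key=probs.__getitem__)
    return EMOTIONS[best], probs

# Toy example: 2-dimensional features and hand-picked illustrative weights.
weights = [[2.0, 0.0], [0.0, 2.0], [-1.0, 0.0], [0.0, -1.0]]
bias = [0.0, 0.0, 0.0, 0.0]
label, probs = classify_expression([1.0, 0.2], weights, bias)
```

In a real deployment, `features` would come from a convolutional network applied to a detected face region, and the imitation-quality score mentioned in the abstract could compare this distribution against the target emotion being imitated.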

Details

Title
FECTS: A Facial Emotion Cognition and Training System for Chinese Children with Autism Spectrum Disorder
Author
Wan, Guobin 1; Deng, Fuhao 2; Jiang, Zijian 2; Song, Sifan 3; Hu, Di 4; Chen, Lifu 5; Wang, Haibo 6; Li, Miaochun 7; Chen, Gong 8; Ting, Yan 9; Su, Jionglong 10; Zhang, Jiaming 11

1  Shenzhen Maternal and Child Health Hospital, Shenzhen 518000, China
2  Shenzhen Institute of Artificial Intelligence and Robotics for Society, Shenzhen 518172, China
3  Department of Mathematical Sciences, Xi'an Jiaotong-Liverpool University, Suzhou 215123, China
4  Robert H. Smith School of Business, University of Maryland, College Park, MD, USA
5  DoGoodly International Education Center (Shenzhen) Co., Ltd. and Smart Children Education Center, Shenzhen 518219, China
6  School of Information and Control Engineering, China University of Mining and Technology, Xuzhou 221116, China
7  Department of Information Management and Information System, Guangdong Pharmaceutical University, Zhongshan 511436, China
8  Sunwoda Electronic Co., Ltd, Shiyan Street, Bao'an District, Shenzhen 518000, China
9  Shenzhen Key Laboratory for Molecular Biology of Neural Development, Guangdong Provincial Key Laboratory of Brain Connectome and Behavior, CAS Key Laboratory of Brain Connectome and Manipulation, The Brain Cognition and Brain Disease Institute, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen-Hong Kong Institute of Brain Science, Shenzhen, Guangdong 518055, China
10  School of AI and Advanced Computing, XJTLU Entrepreneur College (Taicang), Xi'an Jiaotong-Liverpool University, Suzhou, Jiangsu 215123, China
11  Institute of Robotics and Intelligent Manufacturing, The Chinese University of Hong Kong (Shenzhen), Shenzhen 518172, China; Shenzhen Institute of Artificial Intelligence and Robotics for Society, Shenzhen 518172, China
Editor
Miaolei Zhou
Publication year
2022
Publication date
2022
Publisher
John Wiley & Sons, Inc.
ISSN
1687-5265
e-ISSN
1687-5273
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2660748446