Abstract

Human behaviour recognition plays a crucial role in smart education: it offers a nuanced understanding of teaching and learning dynamics by revealing the behaviours of both teachers and students. In this study, to address the requirements of teaching behaviour analysis in smart education, we first constructed a teaching behaviour analysis dataset called EuClass. EuClass contains 13 teacher/student behaviour categories and provides multi-view, multi-scale video data for research on, and practical applications of, teacher/student behaviour recognition. We also propose a teaching behaviour analysis network consisting of an attention-based network and an intra-class differential representation learning module. The attention mechanism is a two-level module operating over the spatial and channel dimensions. The intra-class differential representation learning module uses a unified loss function to reduce the distance between features of the same class. Experiments on the EuClass dataset and on IsoGD, a widely used action/gesture recognition dataset, demonstrate the effectiveness of our method against current state-of-the-art methods, with recognition accuracy improved by 1-2% on average.
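
The abstract describes two components that lend themselves to a brief illustration: a two-level attention module spanning the spatial and channel dimensions, and an intra-class loss that pulls features of the same class closer together. The PyTorch sketch below shows one common way such components are built; the CBAM-style attention block, the centre-based loss, and all module names, shapes and weights are illustrative assumptions rather than the authors' published implementation.

# Minimal, hypothetical sketch (PyTorch assumed): a CBAM-style two-level
# attention block over channel and spatial dimensions, plus a centre-based
# intra-class distance loss. Everything here is an assumption for illustration,
# not the paper's actual implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ChannelSpatialAttention(nn.Module):
    """Two-level attention: re-weight channels first, then spatial positions."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # Channel attention: squeeze spatial dims, re-weight channels via a small MLP.
        self.channel_mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )
        # Spatial attention: 7x7 conv over channel-pooled maps.
        self.spatial_conv = nn.Conv2d(2, 1, kernel_size=7, padding=3)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        # Channel attention from average- and max-pooled descriptors.
        avg = self.channel_mlp(x.mean(dim=(2, 3)))
        mx = self.channel_mlp(x.amax(dim=(2, 3)))
        x = x * torch.sigmoid(avg + mx).view(b, c, 1, 1)
        # Spatial attention from per-position channel statistics.
        pooled = torch.cat([x.mean(dim=1, keepdim=True),
                            x.amax(dim=1, keepdim=True)], dim=1)
        return x * torch.sigmoid(self.spatial_conv(pooled))

class IntraClassDistanceLoss(nn.Module):
    """Pulls each feature toward a learned centre of its class (centre-loss style)."""
    def __init__(self, num_classes: int, feat_dim: int):
        super().__init__()
        self.centers = nn.Parameter(torch.randn(num_classes, feat_dim))

    def forward(self, features: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        return F.mse_loss(features, self.centers[labels])

if __name__ == "__main__":
    attn = ChannelSpatialAttention(channels=64)
    classifier = nn.Linear(64, 13)                      # 13 behaviour categories
    intra_loss = IntraClassDistanceLoss(num_classes=13, feat_dim=64)

    feature_maps = torch.randn(4, 64, 28, 28)           # e.g. per-frame feature maps
    feats = attn(feature_maps).mean(dim=(2, 3))         # attention-weighted, pooled
    labels = torch.randint(0, 13, (4,))
    loss = F.cross_entropy(classifier(feats), labels) + 0.1 * intra_loss(feats, labels)
    loss.backward()

In this kind of design the intra-class term is simply weighted against the classification loss; the weighting factor (0.1 here) is a hyperparameter and is likewise an assumption.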

Details

Title
Multi-view and multi-scale behavior recognition algorithm based on attention mechanism
Author
Zhang, Di; Chen, Chen; Tan, Fa; Qian, Beibei; Li, Wei; He, Xuan; Lei, Susan
Section
METHODS article
Publication year
2023
Publication date
Sep 26, 2023
Publisher
Frontiers Research Foundation
e-ISSN
1662-5218
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2868489929
Copyright
© 2023. This work is licensed under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.