
Abstract

Automating productivity monitoring is crucial for improving the construction industry. Measuring productivity requires identifying which worker is working on which object and the relationship between them. The limited understanding of human-object interaction with large-scale objects in video surveillance has become a significant challenge on construction sites: existing vision-based studies focus only on object detection and activity recognition and do not recognize workers, objects, and actions simultaneously, which prevents managers from measuring worker productivity effectively. To address this issue, this study applies a human-object interaction (HOI) technique consisting of an object detection task and an interaction prediction task, implemented with Faster R-CNN and a graph neural network (GNN). The interactions cover two groups of actions on the formwork structure: productive actions (installing, preparing, and transporting) and a non-productive action (no interaction). Our model achieves mAP scores of 0.674, 0.556, and 0.632 for the local area, the global area, and the average over object areas, respectively, indicating that it can monitor construction productivity effectively. Future studies could exploit additional information, such as temporal cues and worker body postures, to further improve the performance of the HOI model for productivity monitoring.
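To make the two-stage pipeline described above concrete, the following is a minimal sketch of how a Faster R-CNN detector could feed a simple message-passing GNN that classifies worker-object pairs into the four interaction classes named in the abstract. The feature sizes, graph layout, `PairwiseHOIGNN` module, and the use of torchvision's off-the-shelf detector are illustrative assumptions, not the paper's exact architecture.

```python
# Sketch: (1) Faster R-CNN detects workers/formwork objects,
# (2) a toy message-passing GNN scores each (worker, object) pair.
import torch
import torch.nn as nn
import torchvision

# Interaction classes taken from the abstract.
INTERACTIONS = ["installing", "preparing", "transporting", "no_interaction"]

class PairwiseHOIGNN(nn.Module):
    """One round of message passing over detections, then edge classification."""
    def __init__(self, feat_dim=256, hidden=128, num_classes=len(INTERACTIONS)):
        super().__init__()
        self.node_enc = nn.Linear(feat_dim, hidden)          # encode per-box features
        self.message = nn.Sequential(                        # message from node j to node i
            nn.Linear(2 * hidden, hidden), nn.ReLU())
        self.update = nn.GRUCell(hidden, hidden)              # update node state with messages
        self.edge_cls = nn.Sequential(                        # classify worker-object edges
            nn.Linear(2 * hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, num_classes))

    def forward(self, node_feats, worker_idx, object_idx):
        h = torch.relu(self.node_enc(node_feats))             # (N, hidden)
        n = h.size(0)
        # Fully connected graph: mean message aggregated from all nodes.
        hi = h.unsqueeze(1).expand(n, n, -1)
        hj = h.unsqueeze(0).expand(n, n, -1)
        msgs = self.message(torch.cat([hi, hj], dim=-1)).mean(dim=1)
        h = self.update(msgs, h)
        # Score every requested (worker, object) pair.
        pairs = torch.cat([h[worker_idx], h[object_idx]], dim=-1)
        return self.edge_cls(pairs)                            # (num_pairs, num_classes)

# Detection stage: generic pretrained Faster R-CNN as a stand-in for a detector
# fine-tuned on construction imagery.
detector = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
detector.eval()

image = torch.rand(3, 480, 640)                                # placeholder video frame
with torch.no_grad():
    dets = detector([image])[0]                                # 'boxes', 'labels', 'scores'

# Toy per-detection features; in practice RoI-pooled features would be used.
n_dets = dets["boxes"].shape[0]
if n_dets >= 2:
    feats = torch.rand(n_dets, 256)
    hoi_head = PairwiseHOIGNN()
    # Hypothetical pairing: detection 0 is a worker, detection 1 a formwork object.
    logits = hoi_head(feats, torch.tensor([0]), torch.tensor([1]))
    print(INTERACTIONS[logits.argmax(dim=-1).item()])
```

In practice, counting frames classified into the productive classes versus "no_interaction" per worker is what would yield the productivity measure the abstract describes.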

Details

Title
Video-Based Productivity Monitoring of Worker and Large-Scale Object Interactions in Construction Sites
Volume
42
Pages
580-587
Number of pages
9
Publication year
2025
Publication date
2025
Publisher
IAARC Publications
Place of publication
Waterloo
Country of publication
Canada
Source type
Conference Paper
Language of publication
English
Document type
Journal Article
ProQuest document ID
3240508777
Document URL
https://www.proquest.com/conference-papers-proceedings/video-based-productivity-monitoring-worker-large/docview/3240508777/se-2?accountid=208611
Copyright
Copyright IAARC Publications 2025
Last updated
2025-09-03
Database
ProQuest One Academic