
Abstract

Anomaly detection in surveillance video has become a critical concern in computer vision: it enables real-time monitoring and timely alarm generation, and is widely applied in transportation and security systems. This paper implements an unsupervised anomaly detection method for surveillance video based on frame prediction. A Generative Adversarial Network (GAN) is used to generate high-quality frames, with two generators designed to predict the next frame. A non-local U-Net is proposed as Generator 1 to capture global information for frame prediction, while Generator 2 extracts features from related past frames together with large-scale contour information. Anomalies are determined by comparing the predicted frame with the ground truth. During adversarial training, we account for spatial constraints, including gradient loss and intensity loss, as well as temporal constraints, such as optical flow loss. Experiments verify that the proposed method achieves better accuracy on surveillance videos than other state-of-the-art anomaly detection algorithms.
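The abstract's detection step, comparing the predicted frame with the ground truth, is commonly scored in frame-prediction methods by PSNR followed by per-video min-max normalization; a low normalized score suggests an anomalous frame. The sketch below illustrates that convention; the paper's exact scoring formula is not stated in this record, so the function names and normalization are assumptions.

```python
import numpy as np

def psnr(pred: np.ndarray, gt: np.ndarray, max_val: float = 1.0) -> float:
    # Peak signal-to-noise ratio between a predicted frame and the ground
    # truth; a poorly predicted (likely anomalous) frame yields a low PSNR.
    mse = float(np.mean((pred - gt) ** 2))
    if mse == 0.0:
        return float("inf")
    return 10.0 * np.log10(max_val ** 2 / mse)

def regularity_scores(psnr_values) -> np.ndarray:
    # Min-max normalize PSNR over all frames of one video to [0, 1];
    # frames with scores below a chosen threshold are flagged as anomalies.
    p = np.asarray(psnr_values, dtype=np.float64)
    return (p - p.min()) / (p.max() - p.min() + 1e-8)
```

Usage: compute `psnr` for every frame of a test video, normalize with `regularity_scores`, and threshold the result.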

Details

Title
Surveillance video anomaly detection via non-local U-Net frame prediction
Author
Zhang, Qianqian 1; Feng, Guorui 2; Wu, Hanzhou 1

1 Shanghai University, School of Communication and Information Engineering, Shanghai, China (GRID:grid.39436.3b) (ISNI:0000 0001 2323 5732)
2 Shanghai University, School of Communication and Information Engineering, Shanghai, China (GRID:grid.39436.3b) (ISNI:0000 0001 2323 5732); Guangxi Normal University, Guangxi Key Lab of Multi-source Information Mining & Security, Guilin, China (GRID:grid.459584.1) (ISNI:0000 0001 2196 0260)
Pages
27073-27088
Publication year
2022
Publication date
Aug 2022
Publisher
Springer Nature B.V.
ISSN
13807501
e-ISSN
15737721
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2691599833
Copyright
© The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2021.