
© 2021. This work is published under https://creativecommons.org/licenses/by-sa/4.0 (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.

Abstract

Object detection is gaining popularity and is increasingly used on mobile devices for real-time automated video analysis. In this paper, the efficiency of the newly released YOLOv5 object detection model is investigated. Experimental research was performed to evaluate the efficiency of YOLOv5 on a mobile device for real-time object detection tasks. For this purpose, four YOLOv5 model sizes were used: small, medium, large, and extra-large. The experiments were performed with the well-known COCO dataset. Because the original dataset consists of a huge number of images, it was reduced to fit the mobile device requirements. The experimental results show that reducing the COCO dataset has no significant influence on model accuracy, but model performance is highly influenced by the hardware architecture and system on which the model runs. Using the Apple Neural Engine can significantly increase YOLOv5 performance in comparison to CPU inference.
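The abstract states that the COCO dataset was reduced to fit mobile-device requirements, but does not describe the reduction procedure. One hedged sketch of such a reduction — random sampling of the images in a COCO-format annotation structure, with the helper name and the sampling strategy being assumptions, not the authors' method — might look like:

```python
import random


def reduce_coco(coco, keep_fraction, seed=42):
    """Keep a random fraction of images (and only their annotations)
    from a COCO-format annotation dict. Hypothetical helper: the paper
    does not specify its exact reduction procedure."""
    rng = random.Random(seed)  # fixed seed for a reproducible subset
    images = coco["images"]
    n_keep = max(1, int(len(images) * keep_fraction))
    kept_images = rng.sample(images, n_keep)
    kept_ids = {img["id"] for img in kept_images}
    # Drop annotations whose image was removed, keeping the dict consistent.
    coco["images"] = kept_images
    coco["annotations"] = [
        a for a in coco["annotations"] if a["image_id"] in kept_ids
    ]
    return coco
```

A reduction like this keeps the category list and annotation schema intact, so a YOLOv5 training or evaluation pipeline expecting COCO-format input would not need any changes.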

Details

Title
Investigation of YOLOv5 Efficiency in iPhone Supported Systems
Author
Dlužnevskij, Daniel (1); Stefanovic, Pavel (2); Ramanauskaite, Simona (3)

(1) Department of Electronic Systems, Vilnius Gediminas Technical University, Naugarduko g.
(2) Department of Information Systems, Vilnius Gediminas Technical University, Saulėtekio al.
(3) Department of Information Technology, Vilnius Gediminas Technical University, Sauletekio
Pages
333-344
Publication year
2021
Publication date
2021
Publisher
University of Latvia
ISSN
2255-8942
e-ISSN
2255-8950
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2580354477