Abstract

Industry 4.0 is one of the most formative terms of our time. Research focuses in particular on smart, autonomous mobile platforms, which substantially reduce workload and optimize production processes. In order to interact with humans, such platforms need in-depth knowledge of their environment and must therefore detect a variety of static and non-static objects. The goal of this paper is to propose an accurate, real-time capable object detection and localization approach for use on mobile platforms. A method is introduced that uses the powerful detection capabilities of a neural network for the localization of objects. To this end, detection information from a neural network is combined with depth information from an RGB-D camera mounted on a mobile platform. YOLO Version 2 (YOLOv2) is used as the detection network on a mobile robot. In order to find a detected object in the depth image, the bounding boxes predicted by YOLOv2 are mapped to the corresponding regions in the depth image. This provides a powerful and extremely fast approach for establishing a real-time-capable object locator. In the evaluation, the localization approach turns out to be very accurate. Nevertheless, its accuracy depends on the detected object itself and on some additional parameters, which are analysed in this paper.
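The core step of the approach, mapping a YOLOv2 bounding box to the aligned depth image and back-projecting it to a 3D position, can be illustrated with the following minimal sketch. It is not the authors' implementation: the intrinsics FX, FY, CX, CY are hypothetical placeholders for the calibrated values of the RGB-D camera, and the median-depth aggregation over the box is one plausible choice among several.

```python
import numpy as np

# Hypothetical camera intrinsics of the RGB-D sensor (fx, fy, cx, cy);
# real values come from the camera calibration, not from the paper.
FX, FY, CX, CY = 525.0, 525.0, 319.5, 239.5

def locate_object(bbox, depth_image):
    """Estimate the 3D position (camera frame) of an object detected by
    YOLOv2, given its bounding box and the aligned depth image.

    bbox: (x_min, y_min, x_max, y_max) in pixel coordinates.
    depth_image: 2D array of depth values in metres, aligned to the RGB image.
    """
    x_min, y_min, x_max, y_max = (int(v) for v in bbox)

    # Crop the depth region corresponding to the predicted bounding box.
    roi = depth_image[y_min:y_max, x_min:x_max]

    # Use the median of the valid depth values as a robust distance estimate.
    valid = roi[np.isfinite(roi) & (roi > 0)]
    if valid.size == 0:
        return None
    z = float(np.median(valid))

    # Back-project the bounding-box centre through the pinhole camera model.
    u = 0.5 * (x_min + x_max)
    v = 0.5 * (y_min + y_max)
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return np.array([x, y, z])
```

For example, `locate_object((310, 180, 420, 330), depth)` would return the estimated position of a detection whose box spans those pixels, provided the depth image is registered to the RGB image.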

Details

Title
Deep Learning for Real-Time Capable Object Detection and Localization on Mobile Platforms
Author
Particke, F.¹; Kolbenschlag, R.¹; Hiller, M.¹; Patiño-Studencki, L.¹; Thielecke, J.¹

¹ Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU), Information Technologies, Erlangen, Germany
Publication year
2017
Publication date
Oct 2017
Publisher
IOP Publishing
ISSN
1757-8981
e-ISSN
1757-899X
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2564366207
Copyright
© 2017. This work is published under the Creative Commons Attribution 3.0 licence (http://creativecommons.org/licenses/by/3.0/).