Copyright © 2015 Xue-he Zhang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

To be useful for robot vision applications, a stereo matching method for depth estimation must be efficient in terms of both running speed and disparity accuracy. To meet this requirement, a Delaunay-based stereo matching method is proposed in this paper. First, a Canny edge operator detects the edge points of an image, which serve as supporting points. These points are then processed with a Delaunay triangulation algorithm, dividing the whole image into a set of linked triangular facets. A module built on these facets performs a coarse estimation of image disparity. Exploiting the triangles' shared vertices, the coarse estimate is then refined to generate the final disparity map. The method is tested on the Middlebury stereo pairs: it runs in about 1 s with a matching accuracy of 93%. Experimental results show that the proposed method improves both running speed and disparity accuracy, providing a solid foundation and good application prospects for a robot path planning system equipped with stereo cameras.
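The pipeline described in the abstract, edge points used as supporting points, a Delaunay triangulation over them, and a piecewise estimate of disparity across each facet, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the gradient-threshold edge picker stands in for the Canny step, the supporting-point disparities are assumed given, and the refinement over shared vertices is omitted.

```python
import numpy as np
from scipy.spatial import Delaunay

def support_points(img, thresh=0.3):
    """Stand-in for the paper's Canny step: pick pixels whose gradient
    magnitude exceeds a fraction of the maximum (hypothetical simplification)."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ys, xs = np.nonzero(mag > thresh * mag.max())
    return np.column_stack([xs, ys])  # (x, y) coordinates

def coarse_disparity(points, disp_at_points, grid_shape):
    """Triangulate the supporting points and spread their disparities
    across each triangular facet by barycentric interpolation,
    i.e. a piecewise-planar coarse disparity estimate."""
    tri = Delaunay(points)
    h, w = grid_shape
    xs, ys = np.meshgrid(np.arange(w), np.arange(h))
    q = np.column_stack([xs.ravel(), ys.ravel()])
    simplex = tri.find_simplex(q)          # facet containing each pixel
    out = np.full(h * w, np.nan)           # pixels outside the hull stay NaN
    valid = simplex >= 0
    # barycentric coordinates of each pixel inside its facet
    T = tri.transform[simplex[valid]]
    b = np.einsum('ijk,ik->ij', T[:, :2], q[valid] - T[:, 2])
    bary = np.column_stack([b, 1 - b.sum(axis=1)])
    verts = tri.simplices[simplex[valid]]
    out[valid] = (bary * disp_at_points[verts]).sum(axis=1)
    return out.reshape(h, w)
```

Because the interpolation is linear per facet, a disparity field that is exactly planar is reproduced without error; in a full system the per-vertex disparities would come from matching the supporting points between the left and right images.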

Details

Title
Stereo Matching Algorithm Based on 2D Delaunay Triangulation
Author
Zhang, Xue-he; Li, Ge; Li, Chang-le; Zhang, He; Zhao, Jie; Hou, Zhen-xiu
Publication year
2015
Publication date
2015
Publisher
John Wiley & Sons, Inc.
ISSN
1024-123X
e-ISSN
1563-5147
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
1716886822