Abstract

Previous work on iris recognition has focused on visible light (VL) imaging, near-infrared (NIR) imaging, or their fusion. However, only a limited number of works have investigated cross-spectral matching or compared iris biometric performance under the VL and NIR spectra using unregistered iris images taken from the same subject. To the best of our knowledge, this is the first work that proposes a framework for cross-spectral iris matching using unregistered iris images. To this end, three descriptors are proposed, namely Gabor-difference of Gaussian (G-DoG), Gabor-binarized statistical image feature (G-BSIF), and Gabor-multi-scale Weberface (G-MSW), to achieve robust cross-spectral iris matching. In addition, we explore the differences in iris recognition performance across the VL and NIR spectra. The experiments are carried out on the UTIRIS database, which contains iris images acquired under both VL and NIR spectra for the same subjects. Experimental and comparison results demonstrate that the proposed framework achieves state-of-the-art cross-spectral matching. Moreover, the results indicate that the VL and NIR images provide complementary features of the iris pattern, and their fusion notably improves recognition performance.
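The abstract only names the descriptors, so the following is a minimal, hedged sketch of the general idea behind a Gabor-plus-DoG pipeline: Gabor filtering of a normalized iris strip, a difference-of-Gaussians band-pass step to reduce spectrum-dependent illumination effects, binarization into an iris code, and Hamming-distance matching. The parameter values and the helper names (gabor_dog_features, hamming_distance) are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only; parameters and helper names are assumptions,
# not the G-DoG descriptor as specified in the paper.
import numpy as np
from scipy import ndimage
from skimage.filters import gabor

def gabor_dog_features(iris_strip, frequency=0.15, sigma_small=1.0, sigma_large=2.0):
    """Gabor filtering followed by a difference-of-Gaussians (DoG) band-pass step."""
    # Gabor filtering captures the oriented texture of the normalized iris pattern.
    gabor_real, _ = gabor(iris_strip, frequency=frequency)

    # DoG acts as a band-pass filter, attenuating slow illumination variations
    # that differ between visible-light and near-infrared captures.
    dog = (ndimage.gaussian_filter(gabor_real, sigma_small)
           - ndimage.gaussian_filter(gabor_real, sigma_large))

    # Binarize the response into a compact code suitable for Hamming-distance matching.
    return (dog > 0).astype(np.uint8).ravel()

def hamming_distance(code_a, code_b):
    """Normalized Hamming distance between two binary iris codes."""
    return np.count_nonzero(code_a != code_b) / code_a.size
```

In such a scheme, two codes extracted from a VL and an NIR image of the same eye would be compared with hamming_distance, and a decision threshold on that score would separate genuine from impostor comparisons.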

Details

Title
A novel framework for cross-spectral iris matching
Author
Abdullah, Mohammed A. M. (1); Dlay, Satnam S. (2); Woo, Wai L. (2); Chambers, Jonathon A. (2)

(1) ComS2IP Group, School of Electrical and Electronic Engineering, Newcastle University, England, UK; Department of Computer and Information Engineering, Ninevah University, Nineveh, Iraq
(2) ComS2IP Group, School of Electrical and Electronic Engineering, Newcastle University, England, UK
Pages
1-11
Publication year
2016
Publication date
2016
Publisher
Springer Nature B.V.
e-ISSN
1882-6695
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2316848806
Copyright
IPSJ Transactions on Computer Vision and Applications is a copyright of Springer (2016). All rights reserved. © 2016. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.