Abstract

For smart living applications, personal identification and behavior and emotion detection are becoming increasingly important in daily life. For identity classification and facial expression detection, facial features extracted from face images are the most popular and lowest-cost source of information. The face shape, represented by landmarks estimated with a face alignment method, can be used in many applications, including virtual face animation and real face classification. In this paper, we propose a robust face alignment method based on multi-feature shape regression (MSR), which evolved from the explicit shape regression (ESR) proposed by Cao et al. (Int J Comput Vis, 2014, 107:177–190). The proposed MSR face alignment method utilizes color, gradient, and regional information to increase the accuracy of landmark estimation. For face recognition, we further suggest a face warping algorithm that can cooperate with any face alignment algorithm to compensate for facial pose variations and improve recognition performance. For performance evaluation, the proposed and existing face alignment methods are compared on a face alignment database. Based on the alignment-based face recognition concept, the face alignment methods combined with the proposed face warping method are also tested on a face database. Simulation results verify that the proposed MSR face alignment method outperforms the existing face alignment methods.
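The cascaded shape-regression idea that ESR and MSR share can be sketched as follows. This is a minimal illustration, not the authors' implementation: the `toy_features` placeholder and the hand-made stages are assumptions for demonstration only (in the actual methods, each stage is learned from shape-indexed color, gradient, and regional features).

```python
import numpy as np

def cascaded_shape_regression(image, init_shape, regressors, extract_features):
    """Refine landmark positions through a cascade of stage regressors.

    Each stage extracts features indexed by the current shape estimate
    and adds a predicted shape increment: S <- S + R_t(features(I, S)).
    """
    shape = init_shape.astype(float).copy()      # (num_landmarks, 2)
    for regress in regressors:
        feats = extract_features(image, shape)
        shape = shape + regress(feats)           # additive update per stage
    return shape

# Toy demo: each "learned" stage moves the shape halfway toward a fixed
# target, so the cascade converges geometrically toward it.
target = np.array([[10.0, 10.0], [20.0, 15.0]])

def toy_features(image, shape):
    # Placeholder: real MSR features come from color, gradient, and
    # regional information sampled around the current landmarks.
    return shape

stages = [lambda feats: 0.5 * (target - feats) for _ in range(5)]
result = cascaded_shape_regression(
    np.zeros((32, 32)), np.zeros((2, 2)), stages, toy_features
)
# After 5 halving stages the residual error is target * 0.5**5.
```

The additive cascade is what makes these regressors robust: each stage only has to correct the residual left by the previous one, so later stages can specialize in fine local adjustments.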

Details

Title
Multi-feature shape regression for face alignment
Author
Wei-Jong Yang 1; Yi-Chen Chen 1; Pau-Choo Chung 1; Jar-Ferr Yang 1

1 Department of Electrical Engineering, Institute of Computer and Communication Engineering, National Cheng Kung University, Tainan, Taiwan
Pages
1-13
Publication year
2018
Publication date
Aug 2018
Publisher
Springer Nature B.V.
ISSN
1687-6172
e-ISSN
1687-6180
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2081958707
Copyright
EURASIP Journal on Advances in Signal Processing is a copyright of Springer, (2018). All Rights Reserved., © 2018. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.