Abstract

Face sketch-to-photo transformation aims to generate face photo images from sketched face images. Although such transformations have progressed significantly with the development of deep learning techniques in recent years, generating face photos with realistic photo styles and rich facial details remains challenging. In this paper, a new realistic face sketch-to-photo transformation method is proposed based on a feature-filtered residual attention network (FRAN), which is able to propagate more precise feature information through the deep network. Specifically, a feature-filtered residual module is constructed by filtering the feature maps in the residual block to select short-term feature information. In addition, a decoder-guided attention module is designed to integrate and filter the long-term feature information. Moreover, to synthesize face photo images with more facial details, a Sobel operator-based detail loss is proposed to constrain the network training. Experimental results on public datasets demonstrate that FRAN generates more realistic face photo images than state-of-the-art approaches in terms of both visual perception and quantitative quality evaluation. Furthermore, the face photo images generated by FRAN achieve higher face recognition accuracy than those created by the compared methods.
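The Sobel operator-based detail loss mentioned above can be sketched as follows. This is an illustrative reconstruction, not the paper's exact formulation: it assumes the loss is an L1 penalty between the Sobel edge maps of the generated photo and the ground-truth photo, computed here in plain NumPy for clarity.

```python
import numpy as np

# Standard 3x3 Sobel kernels for horizontal and vertical gradients.
SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float64)
SOBEL_Y = SOBEL_X.T

def conv2d(img, kernel):
    """Valid-mode 2D correlation of a grayscale image with a small kernel."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def sobel_edges(img):
    """Gradient magnitude of the Sobel responses (the image's edge map)."""
    gx = conv2d(img, SOBEL_X)
    gy = conv2d(img, SOBEL_Y)
    return np.sqrt(gx ** 2 + gy ** 2)

def detail_loss(generated, target):
    """Assumed form of the detail loss: L1 distance between edge maps."""
    return np.mean(np.abs(sobel_edges(generated) - sobel_edges(target)))
```

Because the loss compares edge maps rather than raw pixels, it penalizes blurred or missing facial contours specifically: two identical images yield zero loss, while a flat image compared against one containing an edge yields a positive loss.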

Details

Title
FRAN: feature-filtered residual attention network for realistic face sketch-to-photo transformation
Author
Wan, Weiguo 1; Yang, Yong 2; Huang, Shuying 3; Gan, Lixin 4

1 Jiangxi University of Finance and Economics, School of Software and Internet of Things Engineering, Nanchang, China (GRID:grid.453548.b) (ISNI:0000 0004 0368 7549)
2 Tiangong University, School of Computer Science and Technology, Tianjin, China (GRID:grid.410561.7) (ISNI:0000 0001 0169 5113)
3 Tiangong University, School of Software, Tianjin, China (GRID:grid.410561.7) (ISNI:0000 0001 0169 5113)
4 Jiangxi Science and Technology Normal University, School of Mathematics and Computer Science, Nanchang, China (GRID:grid.411864.e) (ISNI:0000 0004 1761 3022)
Pages
15946-15956
Publication year
2023
Publication date
Jun 2023
Publisher
Springer Nature B.V.
ISSN
0924-669X
e-ISSN
1573-7497
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2821147810
Copyright
© The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2022. Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.