
Abstract

The feed-forward architectures of recently proposed generative adversarial networks can learn a non-linear mapping from low-resolution input to high-resolution output. However, this approach does not fully exploit the mutual dependencies between images at different resolutions. By analyzing the zero-sum game, this paper proposes an image enhancement algorithm using conditional generative adversarial networks based on an improved non-saturating game. First, the enhanced image produced by the GAN model is used as the condition for the network's target image, so that the original dim, small image learns the network structure of the target image. The proposed generative adversarial network obtains clearer images through the improved non-saturating game: the generator still receives a large gradient and learns sufficiently, compensating for the deficiencies of the minimax game. In addition, the network's loss function adds a discriminator loss term to guide the discriminator toward generating high-quality images. We compared the proposed method (SRG) with other methods including SC, SRCNN, VESPCN and ESPCN; it improved the peak signal-to-noise ratio (PSNR) by 2.348 dB and the structural similarity index measure (SSIM) by 1.89%, enhancing the visual quality of natural images.
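The abstract's gradient argument can be illustrated with a minimal sketch (not the paper's code, and the function names are ours): for a discriminator score d = D(G(z)) on a generated image, the minimax generator loss log(1 − d) has a vanishing gradient when the discriminator confidently rejects the sample (d → 0), whereas the non-saturating loss −log(d) keeps the gradient large at exactly that point, which is the deficiency of the minimax game the paper's improved non-saturating game addresses.

```python
def minimax_grad(d):
    # d/dd of log(1 - d) = -1 / (1 - d); magnitude stays near 1 as d -> 0,
    # so the generator receives almost no learning signal early in training.
    return -1.0 / (1.0 - d)

def non_saturating_grad(d):
    # d/dd of -log(d) = -1 / d; magnitude grows without bound as d -> 0,
    # so a confidently rejected sample still yields a strong gradient.
    return -1.0 / d

# Early in training the discriminator easily rejects generated images (d ~ 0):
d = 0.01
print(abs(minimax_grad(d)))         # ~1.01 (weak signal)
print(abs(non_saturating_grad(d)))  # ~100  (strong signal)
```

The same comparison at d = 0.5 shows both gradients of similar magnitude, so the two losses differ mainly where the generator is losing badly.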

Details

Title
Image enhancement algorithm based on generative adversarial network in combination of improved game adversarial loss mechanism
Author
Xu, Caie 1; Cui, Yang 2; Zhang, Yunhui 2; Gao, Peng 2; Xu, Jiayi 2

 University of Yamanashi, Faculty of Engineering, Kofu, Japan (GRID:grid.267500.6) (ISNI:0000 0001 0291 3581) 
 Hangzhou Dianzi University, School of Computer Science and Technology, Hangzhou, China (GRID:grid.411963.8) (ISNI:0000 0000 9804 6672) 
Pages
9435-9450
Publication year
2020
Publication date
Apr 2020
Publisher
Springer Nature B.V.
ISSN
13807501
e-ISSN
15737721
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2234823081
Copyright
© Springer Science+Business Media, LLC, part of Springer Nature 2019.