Abstract

ESRGAN is a generative adversarial network that produces visually pleasing super-resolution (SR) images with high perceptual quality from low-resolution images. However, it frequently fails to recover local details, resulting in blurry or unnatural visual artifacts. To address this problem, we propose training the generator with an additional perceptual loss (computed using the pretrained PieAPP network), adding skip connections to the discriminator so that it combines features at different scales, and replacing the Leaky ReLU activation functions in the discriminator with ReLU. Through ×4 SR experiments on real and computer-generated image benchmark datasets, it is demonstrated that the proposed method produces SR images with significantly higher perceptual quality than ESRGAN and other ESRGAN enhancements. Specifically, compared to ESRGAN, the proposed method achieved a 5.95 higher DMOS value, a 0.46 lower PI value, and a 0.01 lower LPIPS value. The source code is accessible at https://github.com/cyun-404/PieESRGAN.
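For concreteness, two of the proposed changes can be sketched in minimal Python. All function names, loss weights, and the 0.2 Leaky ReLU slope below are illustrative assumptions, not values taken from the paper:

```python
# Hedged sketch of the abstract's modifications, assuming ESRGAN-style losses.
# Weights and names are hypothetical placeholders, not the paper's values.

def relu(x):
    """Plain ReLU, proposed as the discriminator activation."""
    return max(0.0, x)

def leaky_relu(x, slope=0.2):
    """Leaky ReLU used in the original ESRGAN discriminator."""
    return x if x > 0 else slope * x

def total_generator_loss(l1, esrgan_percep, adversarial, pieapp,
                         w_l1=0.01, w_percep=1.0, w_adv=0.005, w_pieapp=1.0):
    """Generator objective: ESRGAN's loss terms plus the additional
    PieAPP-based perceptual term; the weights here are arbitrary examples."""
    return (w_l1 * l1 + w_percep * esrgan_percep
            + w_adv * adversarial + w_pieapp * pieapp)
```

The third modification, skip connections in the discriminator, combines feature maps from different scales before the final decision; that change depends on the concrete network definition and is omitted from this sketch.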

Details

Title
Improving ESRGAN with an additional image quality loss
Author
Choi, Yoonsil 1; Park, Hanhoon 2

1 Pukyong National University, Department of Computer Engineering, Busan, Republic of Korea (GRID:grid.412576.3) (ISNI:0000 0001 0719 8994)
2 Pukyong National University, Department of Electronic Engineering, Busan, Republic of Korea (GRID:grid.412576.3) (ISNI:0000 0001 0719 8994)
Pages
3123-3137
Publication year
2023
Publication date
Jan 2023
Publisher
Springer Nature B.V.
ISSN
1380-7501
e-ISSN
1573-7721
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2760355321
Copyright
© The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2022. Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.