
Abstract

Remote sensing satellites can simultaneously capture high-spatial-resolution panchromatic (PAN) images and low-spatial-resolution multispectral (MS) images. Pan-sharpening, a form of remote sensing image fusion, aims to generate high-resolution MS images by integrating the spatial information of PAN images with the spectral characteristics of MS images. In this study, a novel deep perceptual patch generative adversarial network (FDPPGAN) was proposed to solve the pan-sharpening problem. First, a perception generator was constructed; it comprises a matching module that accepts input images of different resolutions, a fusion module, a reconstruction module based on a residual structure, and a module for extracting perceptual features. Second, a patch discriminator was used to replace the binary real-or-fake decision on a whole sample with decisions on multiple local patches of the same size, ensuring that the generated results retain more detailed features. Finally, the loss function of FDPPGAN comprised perceptual feature loss, content loss, generator loss, and discriminator loss. Experiments on the QuickBird and WorldView datasets demonstrated that the proposed algorithm is superior to state-of-the-art algorithms in both subjective and objective indices.
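To make the two ideas the abstract highlights concrete, the sketch below illustrates (1) a PatchGAN-style discriminator, which outputs one real/fake score per local patch rather than a single score for the whole image, and (2) a composite generator loss mixing adversarial, content, and perceptual terms. This is a minimal illustration, not the authors' implementation: the layer sizes, the `feat_extractor` stand-in for the perceptual-feature module, and the loss weights `w_adv`, `w_content`, and `w_perc` are all assumptions.

```python
# Minimal sketch of a patch discriminator and a composite generator loss.
# Architecture details and loss weights are illustrative assumptions,
# not taken from the FDPPGAN paper.
import torch
import torch.nn as nn


class PatchDiscriminator(nn.Module):
    """Outputs an (N, 1, H', W') map: one real/fake score per local patch."""

    def __init__(self, in_channels: int = 4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 64, 4, stride=2, padding=1),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(64, 128, 4, stride=2, padding=1),
            nn.BatchNorm2d(128),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(128, 1, 4, stride=1, padding=1),  # patch-wise scores
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


def generator_loss(disc, fused, target, feat_extractor,
                   w_adv=1e-3, w_content=1.0, w_perc=1e-2):
    """Composite loss: adversarial + pixel-wise content + perceptual terms.

    `feat_extractor` stands in for the perceptual-feature module (e.g. a
    frozen pretrained CNN); the weights w_* are hypothetical.
    """
    bce = nn.BCEWithLogitsLoss()
    scores = disc(fused)
    adv = bce(scores, torch.ones_like(scores))         # fool the discriminator
    content = nn.functional.l1_loss(fused, target)     # pixel-level fidelity
    perc = nn.functional.l1_loss(feat_extractor(fused),
                                 feat_extractor(target))  # feature fidelity
    return w_adv * adv + w_content * content + w_perc * perc
```

Because the discriminator returns a map of patch scores rather than one scalar, each local region of the fused image is judged independently, which is what encourages the generator to preserve fine spatial detail.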

Details

Title
FDPPGAN: remote sensing image fusion based on deep perceptual patchGAN
Author
Pan, Yue 1; Pi, Dechang 1; Chen, Junfu 1; Han, Meng 2

1 Nanjing University of Aeronautics and Astronautics, College of Computer Science and Technology, Nanjing, China (GRID:grid.64938.30) (ISNI:0000 0000 9558 9911)
2 Beijing University of Posts and Telecommunications, School of Information and Communication Engineering, Beijing, China (GRID:grid.31880.32)
Pages
9589-9605
Publication year
2021
Publication date
Aug 2021
Publisher
Springer Nature B.V.
ISSN
0941-0643
e-ISSN
1433-3058
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2549101816
Copyright
© The Author(s), under exclusive licence to Springer-Verlag London Ltd., part of Springer Nature 2021.