Abstract

In recent years, hyperspectral image classification techniques have attracted considerable attention because they can be used to model the development of different cities and provide a reference for urban planning and construction. However, because hyperspectral images are difficult to obtain, only a limited number of pixels are available as training samples. Adequately extracting and utilizing the spatial and spectral information of hyperspectral images with limited training samples has therefore become a challenging problem. To address this issue, we propose a hyperspectral image classification method based on dense pyramidal convolution and multi-feature fusion (DPCMF). In this approach, two branches extract spatial and spectral features, respectively. In the spatial branch, dense pyramidal convolutions and non-local blocks extract multi-scale local and global spatial features from image samples, which are then fused to obtain the spatial features. In the spectral branch, dense pyramidal convolution layers extract spectral features from image samples. Finally, the spatial and spectral features are fused and fed into fully connected layers to obtain the classification results. The experimental results show that the overall accuracy (OA) of the proposed method is 96.74%, 98.10%, 98.92% and 96.67% on four hyperspectral datasets, respectively, a significant improvement over five baseline methods for hyperspectral classification: SVM, SSRN, FDSSC, DBMA and DBDA. The proposed method can therefore better extract and exploit the spatial and spectral information in image samples when the number of training samples is limited, and can provide more realistic and intuitive terrain and environmental conditions for urban planning, design, construction and management.
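To make the two-branch design described in the abstract concrete, below is a minimal PyTorch sketch of a network in the spirit of DPCMF: densely connected pyramidal convolutions in both branches, a non-local (self-attention) block in the spatial branch, and concatenation-based fusion into a linear classification head. All class names, layer counts, channel sizes, kernel choices and the use of 2D spatial plus 1D spectral convolutions are illustrative assumptions made for brevity, not the authors' exact architecture or hyperparameters.

import torch
import torch.nn as nn


class PyramidalConv(nn.Module):
    # Parallel convolutions at several kernel sizes, concatenated (multi-scale).
    def __init__(self, in_ch, out_ch, dim=2, kernel_sizes=(1, 3, 5)):
        super().__init__()
        Conv = nn.Conv2d if dim == 2 else nn.Conv1d
        branch_ch = out_ch // len(kernel_sizes)
        self.branches = nn.ModuleList(
            Conv(in_ch, branch_ch, k, padding=k // 2) for k in kernel_sizes
        )

    def forward(self, x):
        return torch.cat([b(x) for b in self.branches], dim=1)


class NonLocalBlock(nn.Module):
    # Self-attention over all spatial positions to capture global context.
    def __init__(self, ch):
        super().__init__()
        self.theta = nn.Conv2d(ch, ch // 2, 1)
        self.phi = nn.Conv2d(ch, ch // 2, 1)
        self.g = nn.Conv2d(ch, ch // 2, 1)
        self.out = nn.Conv2d(ch // 2, ch, 1)

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.theta(x).flatten(2).transpose(1, 2)  # (B, HW, C/2)
        k = self.phi(x).flatten(2)                    # (B, C/2, HW)
        v = self.g(x).flatten(2).transpose(1, 2)      # (B, HW, C/2)
        attn = torch.softmax(q @ k, dim=-1)           # (B, HW, HW)
        y = (attn @ v).transpose(1, 2).reshape(b, c // 2, h, w)
        return x + self.out(y)                        # residual connection


class DPCMFSketch(nn.Module):
    def __init__(self, bands=103, n_classes=9, growth=24, n_layers=3):
        super().__init__()
        # Spatial branch: densely connected pyramidal convs + non-local block.
        self.spatial_layers = nn.ModuleList()
        sp_ch = bands
        for _ in range(n_layers):
            self.spatial_layers.append(PyramidalConv(sp_ch, growth, dim=2))
            sp_ch += growth  # dense connectivity: every output is re-used
        self.non_local = NonLocalBlock(sp_ch)
        self.spatial_pool = nn.AdaptiveAvgPool2d(1)

        # Spectral branch: dense pyramidal 1D convs over the band axis.
        self.spectral_layers = nn.ModuleList()
        sc_ch = 1
        for _ in range(n_layers):
            self.spectral_layers.append(PyramidalConv(sc_ch, growth, dim=1))
            sc_ch += growth
        self.spectral_pool = nn.AdaptiveAvgPool1d(1)

        # Fusion: concatenate both feature vectors, classify with a linear head.
        self.head = nn.Linear(sp_ch + sc_ch, n_classes)

    def forward(self, patch):
        # patch: (B, bands, H, W) hyperspectral patch around the target pixel.
        x = patch
        for layer in self.spatial_layers:
            x = torch.cat([x, torch.relu(layer(x))], dim=1)  # dense connection
        spatial_feat = self.spatial_pool(self.non_local(x)).flatten(1)

        # Treat the centre pixel's spectrum as a 1D signal: (B, 1, bands).
        s = patch[:, :, patch.shape[2] // 2, patch.shape[3] // 2].unsqueeze(1)
        for layer in self.spectral_layers:
            s = torch.cat([s, torch.relu(layer(s))], dim=1)
        spectral_feat = self.spectral_pool(s).flatten(1)

        return self.head(torch.cat([spatial_feat, spectral_feat], dim=1))


model = DPCMFSketch(bands=103, n_classes=9)   # band/class counts are examples
logits = model(torch.randn(4, 103, 9, 9))     # four hypothetical 9x9 patches
print(logits.shape)                           # torch.Size([4, 9])

Dense connectivity (each layer receiving the concatenation of all earlier outputs) is what lets a compact network reuse features when training pixels are scarce, while the non-local block supplies the global spatial context that small convolution kernels miss.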

Details

Title
Hyperspectral Image Classification Based on Dense Pyramidal Convolution and Multi-Feature Fusion
Author
Zhang, Junsan 1; Zhao, Li 2; Jiang, Hongzhao 3; Shen, Shigen 4; Wang, Jian 5; Zhang, Peiying 1; Zhang, Wei 6; Wang, Leiquan 2

1 College of Computer Science and Technology, China University of Petroleum (East China), Qingdao 266580, China; [email protected] (J.Z.); [email protected] (L.Z.); [email protected] (P.Z.); [email protected] (L.W.); State Key Laboratory of Integrated Services Networks, Xidian University, Xi’an 710071, China
2 College of Computer Science and Technology, China University of Petroleum (East China), Qingdao 266580, China; [email protected] (J.Z.); [email protected] (L.Z.); [email protected] (P.Z.); [email protected] (L.W.)
3 The Sixth Research Institute of China Electronics Corporation, Beijing 100083, China; [email protected]
4 School of Information Engineering, Huzhou University, Huzhou 313000, China
5 College of Science, China University of Petroleum (East China), Qingdao 266580, China; [email protected]
6 Shandong Provincial Key Laboratory of Computer Networks, Shandong Computer Science Center (National Supercomputer Center in Jinan), Qilu University of Technology (Shandong Academy of Sciences), Jinan 250013, China; [email protected]
First page
2990
Publication year
2023
Publication date
2023
Publisher
MDPI AG
e-ISSN
2072-4292
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2829887484
Copyright
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.