Full Text


© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.

Abstract

This study proposes a novel approach to estimating canopy density in Picea schrenkiana var. tianschanica forest sub-compartments by integrating optical and radar satellite data, with the aim of enhancing methodologies for forest resource surveys and monitoring, which are particularly vital for the sustainable development of semi-arid mountainous areas with fragile ecological environments. The study area is the West Tianshan Mountain Nature Reserve in Xinjiang, characterized by its unique dominant tree species, Picea schrenkiana. A total of 411 characteristic factors were extracted from Gaofen-2 (GF-2) sub-meter optical satellite imagery, Gaofen-3 (GF-3) multi-polarization synthetic aperture radar (SAR) satellite imagery, and digital elevation model (DEM) data. From these, 17 characteristic parameters were selected based on their correlation with canopy density data to construct an estimation model. Three distinct models were developed: a multiple stepwise regression model (a linear approach), a Back Propagation (BP) neural network model (a neural-network-based method), and a Cubist model (a decision-tree-based technique). The results indicate that combining optical and radar image characteristics significantly enhances accuracy, with the Average Absolute Percentage Precision (AAPP) improving from 76.50% (optical imagery alone) and 78.50% (radar imagery alone) to 78.66% (both combined). Of the three models, the BP neural network model achieved the highest overall accuracy (79.19%). At the sub-compartment scale, the BP neural network model demonstrated superior accuracy for low canopy density estimation (75.37%), whereas the Cubist model, leveraging radar image characteristics, excelled for medium density estimation (87.46%). Notably, the integrated Cubist model combining optical and radar data achieved the highest accuracy for high canopy density estimation (89.17%).
This study highlights the effectiveness of integrating optical and radar data for precise canopy density assessment, contributing significantly to ecological resource monitoring methodologies and environmental assessments.
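As a point of reference, the accuracy figures above could be reproduced with a short routine, assuming AAPP is defined as 100% minus the mean absolute percentage error (the abstract does not give the formula, so this definition, the function name, and the sample values below are illustrative assumptions, not the authors' implementation):

```python
def aapp(observed, predicted):
    """Average Absolute Percentage Precision, assumed here to be
    100 * (1 - mean absolute percentage error)."""
    if not observed or len(observed) != len(predicted):
        raise ValueError("inputs must be equal-length, non-empty sequences")
    # Mean of per-sample absolute relative errors against the observed value.
    mape = sum(abs(o - p) / o for o, p in zip(observed, predicted)) / len(observed)
    return 100.0 * (1.0 - mape)

# Illustrative canopy-density fractions in [0, 1] (not data from the study).
obs = [0.45, 0.60, 0.75, 0.30]
pred = [0.50, 0.55, 0.70, 0.33]
print(round(aapp(obs, pred), 2))  # → 90.97
```

A higher AAPP means predictions deviate less, in relative terms, from the field-measured canopy density, which is why the combined optical-plus-radar feature set (78.66%) outperforms either sensor alone under this metric.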

Details

Title
Estimation of Picea Schrenkiana Canopy Density at Sub-Compartment Scale by Integration of Optical and Radar Satellite Images
Author
Wang, Yibo 1; Li, Xusheng 2; Yang, Xiankun 3; Qi, Wenchao 4; Zhang, Donghui 5; Wang, Jinnian 3

1 School of Geography and Remote Sensing, Guangzhou University, Guangzhou 510006, China; [email protected] (Y.W.); [email protected] (X.Y.); [email protected] (J.W.); Aerospace Information Research Institute, Chinese Academy of Sciences, Beijing 100101, China; [email protected]
2 Tianjin Centre of Geological Survey, China Geological Survey, Tianjin 300170, China; North China Center for Geoscience Innovation, China Geological Survey, Tianjin 300170, China
3 School of Geography and Remote Sensing, Guangzhou University, Guangzhou 510006, China; [email protected] (Y.W.); [email protected] (X.Y.); [email protected] (J.W.)
4 Aerospace Information Research Institute, Chinese Academy of Sciences, Beijing 100101, China; [email protected]
5 Institute of Remote Sensing Satellite, China Academy of Space Technology, Beijing 100095, China; [email protected]
First page
1145
Publication year
2024
Publication date
2024
Publisher
MDPI AG
e-ISSN
1999-4907
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
3084924286