© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.

Abstract

Spectral unmixing remains the most popular method for estimating the composition of mixed pixels. However, spectral-based unmixing methods cannot easily distinguish vegetation types with similar spectral characteristics (e.g., different forest tree species). Furthermore, in large areas with significant heterogeneity, extracting a large number of pure endmember samples is challenging. Here, we implement a fractional evergreen forest cover self-adaptive parameter (FEVC-SAP) approach to measure FEVC at the regional scale from continuous intra-year time-series normalized difference vegetation index (NDVI) values derived from Moderate Resolution Imaging Spectroradiometer (MODIS) imagery acquired over southern China, an area with a complex mixture of temperate, subtropical, and tropical climates containing evergreen and deciduous forests. Considering the cover of evergreen forest as a fraction of total forest (evergreen plus non-evergreen forest), the dimidiate pixel model, combined with an index of evergreen forest phenological characteristics (NDVIann-min, the intra-annual minimum NDVI value), was used to distinguish between evergreen and non-evergreen forest within a pixel. Because of spatial heterogeneity, the optimal model parameters differ among regions. By dividing the study area into grids, our method converts image spectral information into gray-level information and uses the Otsu threshold segmentation method to estimate the appropriate parameters for each grid, allowing adaptive acquisition of FEVC parameters. Mapping accuracy was assessed at the pixel and sub-pixel scales. At the pixel scale, confusion matrix analysis showed that our evergreen forest classification achieved higher overall accuracy (87.5%) than existing land cover products, including GLC30 and MOD12. At the sub-pixel scale, a strong linear correlation was found between the cover fraction predicted by our method and the reference cover fraction obtained from GF-1 images (R2 = 0.86). Compared with other methods, FEVC-SAP had a lower estimation deviation (root mean square error = 8.6%). Moreover, the proposed method achieved greater estimation accuracy in densely forested areas than in sparsely forested ones. Our results highlight the utility of the adaptive-parameter linear unmixing model for quantitative evaluation of the coverage of evergreen forest and other vegetation types at large scales.
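The two building blocks the abstract names, the dimidiate (two-endmember linear) pixel model and per-grid Otsu threshold selection, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the function names, the NumPy-based Otsu routine, and the endmember NDVI values in the usage example are all hypothetical.

```python
import numpy as np

def dimidiate_pixel_fraction(ndvi_ann_min, ndvi_non_evergreen, ndvi_evergreen):
    """Dimidiate pixel model: the evergreen fraction of a pixel is the linear
    position of its intra-annual minimum NDVI between the non-evergreen and
    pure-evergreen endmember values (clipped to [0, 1])."""
    frac = (ndvi_ann_min - ndvi_non_evergreen) / (ndvi_evergreen - ndvi_non_evergreen)
    return np.clip(frac, 0.0, 1.0)

def otsu_threshold(gray, nbins=256):
    """Otsu's method: pick the gray-level threshold that maximizes the
    between-class variance of the histogram."""
    hist, bin_edges = np.histogram(gray, bins=nbins)
    hist = hist.astype(float) / hist.sum()
    bin_centers = (bin_edges[:-1] + bin_edges[1:]) / 2.0
    w0 = np.cumsum(hist)                  # cumulative class-0 probability
    w1 = 1.0 - w0                         # class-1 probability
    mu = np.cumsum(hist * bin_centers)    # cumulative mean
    mu_t = mu[-1]                         # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        var_between = (mu_t * w0 - mu) ** 2 / (w0 * w1)
    var_between = np.nan_to_num(var_between)  # empty classes contribute nothing
    return bin_centers[np.argmax(var_between)]

# Hypothetical usage: for one grid cell, derive a threshold from the
# gray-level (NDVIann-min) distribution, use it to set the endmember
# parameters, then unmix each pixel.
grid_ndvi_min = np.concatenate([np.random.uniform(0.1, 0.3, 500),   # non-evergreen
                                np.random.uniform(0.6, 0.8, 500)])  # evergreen
t = otsu_threshold(grid_ndvi_min)
fractions = dimidiate_pixel_fraction(grid_ndvi_min,
                                     ndvi_non_evergreen=0.2,  # assumed endmembers
                                     ndvi_evergreen=0.8)
```

Estimating the two endmember parameters per grid (rather than globally) is what makes the scheme "self-adaptive": each grid's Otsu split of the gray-level distribution stands in for locally pure endmember samples.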

Details

Title
An Adaptive-Parameter Pixel Unmixing Method for Mapping Evergreen Forest Fractions Based on Time-Series NDVI: A Case Study of Southern China
Author
Yang, Yingying 1; Wu, Taixia 2; Zeng, Yuhui 2; Wang, Shudong 3

1  School of Earth Sciences and Engineering, Hohai University, Nanjing 210098, China; Research Center for Eco-Environmental Sciences, Chinese Academy of Sciences, Beijing 100101, China
2  School of Earth Sciences and Engineering, Hohai University, Nanjing 210098, China; [email protected] (Y.Y.); [email protected] (T.W.); [email protected] (Y.Z.)
3  Aerospace Information Research Institute, Chinese Academy of Sciences, Beijing 100101, China
First page
4678
Publication year
2021
Publication date
2021
Publisher
MDPI AG
e-ISSN
2072-4292
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2602184627