Full Text



Abstract

Remotely sensed images with low resolution can be used effectively for large-area monitoring of vegetation restoration, but they are unsuitable for accurate small-area monitoring. This limits researchers’ ability to study vegetation species composition, biodiversity, and ecosystem function after ecological restoration. Therefore, this study uses LiDAR and hyperspectral data to develop a hierarchical vegetation classification method based on LiDAR technology, a decision tree, and a random forest classifier, and applies it to the eastern waste dump of the Heidaigou mining area in Inner Mongolia, China, which has been under restoration for around 15 years, to verify the effectiveness of the method. The results were as follows. (1) The intensity, height, and echo characteristics of the LiDAR point cloud data and the spectral, vegetation index, and texture features of the hyperspectral image data effectively reflected the differences in vegetation species composition. (2) Vegetation indices contributed most to the classification of vegetation species composition types, followed by height, whereas spectral data alone contributed less; it was therefore necessary to screen the LiDAR and hyperspectral features before classifying the vegetation. (3) The hierarchical classification method effectively distinguished between trees (Populus spp., Pinus tabuliformis, Hippophae sp. (arbor), and Robinia pseudoacacia), shrubs (Amorpha fruticosa, Caragana microphylla + Hippophae sp. (shrub)), and grass species, with a classification accuracy of 87.45% and a Kappa coefficient of 0.79, nearly 43% higher than unsupervised classification and 10.7–22.7% higher than other supervised classification methods. In conclusion, the fusion of LiDAR and hyperspectral data can accurately and reliably estimate and classify vegetation structural parameters and reveal the type, quantity, and diversity of vegetation, thus providing a sufficient basis for the assessment and improvement of vegetation after restoration.
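As a rough illustration of the workflow summarized above, the following minimal Python sketch shows how LiDAR-derived features (e.g., height, intensity, echo counts) and hyperspectral-derived features (band reflectance, vegetation indices, texture) could be screened by random-forest feature importance and then used to train a random forest classifier evaluated with overall accuracy and the Kappa coefficient. All feature names, placeholder data, and thresholds here are hypothetical assumptions for illustration only; this is not the authors' implementation.

# Illustrative sketch: feature screening by random-forest importance, then classification.
# Placeholder data and the median-importance threshold are assumptions, not from the paper.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, cohen_kappa_score

# Hypothetical per-segment feature matrix: LiDAR features (height, intensity, echoes)
# stacked with hyperspectral features (reflectance, vegetation indices, texture).
rng = np.random.default_rng(0)
n_samples, n_features = 1000, 20
X = rng.normal(size=(n_samples, n_features))   # placeholder features
y = rng.integers(0, 4, size=n_samples)         # placeholder labels (e.g., tree/shrub/grass/bare)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Step 1: rank features by their contribution (Gini importance) using a random forest.
ranker = RandomForestClassifier(n_estimators=200, random_state=0)
ranker.fit(X_train, y_train)
importances = ranker.feature_importances_

# Step 2: keep only the more informative features (median cut-off is an arbitrary example).
selected = importances > np.median(importances)

# Step 3: train the final classifier on the screened feature set and evaluate it.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train[:, selected], y_train)
y_pred = clf.predict(X_test[:, selected])

print("Overall accuracy:", accuracy_score(y_test, y_pred))
print("Kappa coefficient:", cohen_kappa_score(y_test, y_pred))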

Details

Title
Revealing the Structure and Composition of the Restored Vegetation Cover in Semi-Arid Mine Dumps Based on LiDAR and Hyperspectral Images
Author
Tang, Jiajia 1; Liang, Jie 2; Yang, Yongjun 3; Zhang, Shaoliang 3; Hou, Huping 3; Zhu, Xiaoxiao 3

1 Engineering Research Center of Ministry of Education for Mine Ecological Restoration, China University of Mining and Technology, Xuzhou 221008, China; [email protected] (J.T.); [email protected] (S.Z.); [email protected] (H.H.); [email protected] (X.Z.); School of Environment and Spatial Informatics, China University of Mining and Technology, Xuzhou 221008, China
2 College of Land Science and Technology, China Agricultural University, Beijing 100083, China; [email protected]; Institute of Territorial and Spatial Planning of Inner Mongolia, Hohhot 010070, China
3 Engineering Research Center of Ministry of Education for Mine Ecological Restoration, China University of Mining and Technology, Xuzhou 221008, China; [email protected] (J.T.); [email protected] (S.Z.); [email protected] (H.H.); [email protected] (X.Z.)
First page
978
Publication year
2022
Publication date
2022
Publisher
MDPI AG
e-ISSN
2072-4292
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2633146224
Copyright
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.