
Abstract

InSAR and optical techniques are two principal approaches for generating large-scale Digital Elevation Models (DEMs). Owing to the inherent limitations of each technology, a single data source is insufficient to produce high-quality DEM products. The growing number of satellites in orbit has generated vast amounts of InSAR and optical DEM data, providing opportunities to improve final DEM products by using the existing data more effectively. Previous research has established that complete DEMs generated by InSAR technology can be combined with optical DEMs to produce a fused DEM with higher accuracy and less noise. Traditional DEM fusion methods typically compute the result as a weighted average; in theory, with appropriately chosen weights, the fusion outcome is optimal. In practice, however, DEMs frequently lack prior weight information, particularly precise weights. To address this issue, this study adopts a fully connected artificial neural network for elevation fusion prediction. The approach advances existing neural network models by integrating local elevation and terrain as input features and incorporating curvature as an additional terrain characteristic to enrich the representation of terrain. We also investigate how terrain factors and local terrain features, used as training features, affect the fused elevation outputs. Finally, three representative study areas in Oregon, USA, and Macao, China, were selected for empirical validation. The terrain data comprise an InSAR DEM, the AW3D30 DEM, and a Lidar DEM. The results indicate that, compared with traditional neural network methods, the proposed approach improves the Root-Mean-Squared Error (RMSE) by 5.0% to 12.3% and the Normalized Median Absolute Deviation (NMAD) by 10.3% to 26.6% in the test areas, validating the effectiveness of the proposed method.
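A minimal sketch of the approach described in the abstract, in Python with NumPy and PyTorch. It builds per-pixel features from the two input DEMs (elevation plus slope and a Laplacian curvature proxy), trains a small fully connected network against a Lidar reference, and includes the traditional weighted-average baseline for comparison. All names (terrain_features, build_features, FusionNet, fuse, weighted_average), the network width, the feature set, and the training settings are illustrative assumptions, not the authors' exact model:

import numpy as np
import torch
import torch.nn as nn

def terrain_features(dem, cellsize=30.0):
    """Slope magnitude and a Laplacian curvature proxy via finite differences."""
    dz_dy, dz_dx = np.gradient(dem, cellsize)          # axis 0 = y, axis 1 = x
    slope = np.sqrt(dz_dx ** 2 + dz_dy ** 2)
    curvature = (np.gradient(dz_dx, cellsize, axis=1)
                 + np.gradient(dz_dy, cellsize, axis=0))
    return slope, curvature

def build_features(insar_dem, optical_dem, cellsize=30.0):
    """Per-pixel feature vectors: both elevations plus slope/curvature of each."""
    feats = [insar_dem, optical_dem]
    for dem in (insar_dem, optical_dem):
        slope, curv = terrain_features(dem, cellsize)
        feats += [slope, curv]
    return np.stack([f.ravel() for f in feats], axis=1).astype(np.float32)

class FusionNet(nn.Module):
    """Small fully connected regressor from terrain features to elevation."""
    def __init__(self, in_features):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_features, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, x):
        return self.net(x)

def weighted_average(insar_dem, optical_dem, w_insar=0.5):
    """Traditional baseline: optimal only if weights match true error variances."""
    return w_insar * insar_dem + (1.0 - w_insar) * optical_dem

def fuse(insar_dem, optical_dem, lidar_dem, cellsize=30.0, epochs=300):
    X = torch.from_numpy(build_features(insar_dem, optical_dem, cellsize))
    X = (X - X.mean(0)) / (X.std(0) + 1e-8)            # standardize features
    y = torch.from_numpy(lidar_dem.ravel().astype(np.float32)).unsqueeze(1)
    model = FusionNet(X.shape[1])
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss_fn(model(X), y).backward()
        opt.step()
    with torch.no_grad():
        return model(X).numpy().reshape(insar_dem.shape)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    truth = rng.standard_normal((64, 64)).cumsum(axis=0).cumsum(axis=1)
    insar = truth + rng.normal(0.0, 1.0, truth.shape)  # synthetic noisy inputs
    optical = truth + rng.normal(0.0, 2.0, truth.shape)
    fused = fuse(insar, optical, truth)
    print("RMSE fused:", np.sqrt(np.mean((fused - truth) ** 2)))
    print("RMSE mean :", np.sqrt(np.mean((weighted_average(insar, optical) - truth) ** 2)))

In practice one would train on reference elevations in one area and predict in another; a pixel-wise network like this sees local terrain context only through the hand-built slope and curvature features, which is why the paper's addition of curvature as an input matters.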

Details

Title
Neural Network-Based Fusion of InSAR and Optical Digital Elevation Models with Consideration of Local Terrain Features
Author
Gui, Rong 1; Qin, Yuanjun 2; Hu, Zhi 2; Dong, Jiazhen 2; Sun, Qian 3; Hu, Jun 1; Yuan, Yibo 2; Mo, Zhiwei 2

1 School of Geoscience and Info-Physics, Central South University, Changsha 410083, China; [email protected] (R.G.); [email protected] (Y.Q.); [email protected] (Z.H.); [email protected] (J.D.); [email protected] (J.H.); [email protected] (Y.Y.); [email protected] (Z.M.); Hunan Geological Disaster Monitoring, Early Warning and Emergency Rescue Engineering Technology Research Center, Changsha 410004, China
2 School of Geoscience and Info-Physics, Central South University, Changsha 410083, China
3 College of Geographic Science, Hunan Normal University, Changsha 410081, China; Key Laboratory of Geospatial Big Data Mining and Application, Changsha 410081, China
First page
3567
Publication year
2024
Publication date
2024
Publisher
MDPI AG
e-ISSN
2072-4292
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
3116659069
Copyright
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).