Abstract

Hyperspectral systems integrated on unmanned aerial vehicles (UAV) provide unique opportunities to conduct high-resolution multitemporal spectral analysis for diverse applications. However, time-consuming rectification efforts are routinely required in postprocessing, since geometric distortions can be introduced by UAV movements during flight, even when navigation/motion sensors are used to track the position of each scan. Part of the challenge in obtaining high-quality imagery relates to the lack of a fast processing workflow that can retrieve geometrically accurate mosaics while reducing ground data collection efforts. To address this problem, we explored a computationally robust automated georectification and mosaicking methodology. It operates effectively in a parallel computing environment and was evaluated against a number of high-spatial-resolution datasets (mm to cm resolution) collected using a push-broom sensor and an associated RGB frame-based camera. The methodology estimates the luminance of the hyperspectral swaths and coregisters these against the luminance of an RGB-based orthophoto. The procedure includes an improved coregistration strategy that integrates the Speeded-Up Robust Features (SURF) algorithm with the Maximum Likelihood Estimator Sample Consensus (MLESAC) approach. SURF identifies common features between each swath and the RGB orthomosaic, while MLESAC fits the best geometric transformation model to the retrieved matches. Individual scanlines are then geometrically transformed and merged into a single, spatially continuous mosaic, achieving high positional accuracy with only a small number of ground control points (GCPs). The capacity of the workflow to achieve high spatial accuracy was demonstrated by examining statistical metrics such as the root mean square error (RMSE), the mean absolute error (MAE), and the relative positional accuracy at the 95% confidence level. Comparison against a user-generated georectification demonstrates that the automated approach speeds up the coregistration process by 85%.
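
As a rough illustration of the feature-matching and model-fitting steps described above, the Python sketch below mirrors the SURF-plus-robust-estimator idea using OpenCV. The file names, threshold values, and the choice of a homography are illustrative assumptions, not the authors' implementation; SURF is only available in OpenCV builds that include the non-free contrib modules, and since OpenCV does not ship MLESAC, RANSAC is used here as a stand-in for the robust fitting stage.

```python
import cv2
import numpy as np

# Hypothetical inputs: one hyperspectral swath rendered to RGB, and the
# georeferenced RGB orthomosaic. OpenCV's grayscale conversion applies the
# Rec. 601 luma weights (0.299 R + 0.587 G + 0.114 B), which serves here
# as the luminance estimate used for matching.
swath = cv2.cvtColor(cv2.imread("swath_rgb.png"), cv2.COLOR_BGR2GRAY)
base = cv2.cvtColor(cv2.imread("orthomosaic_rgb.png"), cv2.COLOR_BGR2GRAY)

# 1. Detect and describe SURF features in both luminance images
#    (requires an OpenCV build with the non-free contrib modules).
surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
kp1, des1 = surf.detectAndCompute(swath, None)
kp2, des2 = surf.detectAndCompute(base, None)

# 2. Match descriptors, keeping unambiguous matches via Lowe's ratio test.
matcher = cv2.BFMatcher(cv2.NORM_L2)
good = []
for pair in matcher.knnMatch(des1, des2, k=2):
    if len(pair) == 2 and pair[0].distance < 0.7 * pair[1].distance:
        good.append(pair[0])

# 3. Robustly fit a geometric transformation to the putative matches.
#    RANSAC stands in for the MLESAC estimator used in the paper.
src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)

# 4. Warp the swath into the orthomosaic frame; in the full workflow the
#    same transform would be applied to every band of the swath's cube
#    before merging the scanlines into a continuous mosaic.
registered = cv2.warpPerspective(swath, H, (base.shape[1], base.shape[0]))
```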

Details

Title
Automated Georectification and Mosaicking of UAV-Based Hyperspectral Imagery from Push-Broom Sensors
Author
Angel, Yoseline 1; Turner, Darren 2; Parkes, Stephen 1; Malbeteau, Yoann 3; Lucieer, Arko 2; McCabe, Matthew F 1

1 Hydrology, Agriculture and Land Observation Group (HALO), Division of Biological and Environmental Science and Engineering, King Abdullah University of Science and Technology, Thuwal 23955-6900, Saudi Arabia; [email protected] (S.P.); [email protected] (Y.M.); [email protected] (M.F.M.)
2 Discipline of Geography and Spatial Sciences, College of Sciences and Engineering, University of Tasmania, Hobart 7001, Australia; [email protected] (D.T.); [email protected] (A.L.)
3 Hydrology, Agriculture and Land Observation Group (HALO), Division of Biological and Environmental Science and Engineering, King Abdullah University of Science and Technology, Thuwal 23955-6900, Saudi Arabia; Center for Remote Sensing Applications (CRSA), Mohammed VI Polytechnic University, Ben Guerir 43150, Morocco
First page
34
Publication year
2020
Publication date
2020
Publisher
MDPI AG
e-ISSN
2072-4292
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2550316770
Copyright
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.