Abstract
The evolution of airborne mapping has seen the introduction of hybrid lidar-camera systems that enhance data collection, i.e., acquire a high-density point cloud and texture simultaneously. Yet, the joint adjustment of both optical data streams is a non-trivial problem due to the different ways in which errors, including those originating from navigation sensors, affect their mapping accuracy. Stemming from a special form of graph-based optimization, dynamic networks allow rigorous modeling of spatio-temporal constraints and thus provide a common framework for optimizing the original observations of inertial systems together with those of optical sensors. In this work, we propose a cross-domain observation model that leverages pixel-to-point correspondences as links between imagery and lidar returns. First, we describe how such correspondences can be introduced into the optimization. Second, we employ a reference dataset to emulate a set of precise pixel-to-point correspondences and assess their prospective impact on the common (rather than cascaded) optimization. We report an improvement in the estimated trajectory attitude error with a lower-quality IMU and, consequently, in point-cloud registration. Finally, we study whether such correspondences could be obtained from freely available deep-learning networks with the desired accuracy and quality. We conclude that although the introduction of such camera-to-lidar constraints has significant potential, none of the studied machine-learning networks fulfills the correspondence-detection requirements in terms of quality.
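The abstract does not give the exact form of the camera-to-lidar constraint used in the dynamic network. As an illustration only, the following minimal Python/NumPy sketch shows the kind of residual a pixel-to-point correspondence could contribute to a graph-based adjustment: a georeferenced lidar return is projected through an assumed pinhole camera model and compared with the matched pixel. Function and variable names (K, R_cw, t_cw, etc.) are hypothetical and not taken from the paper.

```python
import numpy as np

def reprojection_residual(K, R_cw, t_cw, X_world, uv_obs):
    """Pixel-to-point constraint (illustrative): residual between an observed
    image feature uv_obs (pixels) and the projection of the matched lidar
    return X_world (metres) through a pinhole camera with intrinsics K.

    R_cw, t_cw map world coordinates into the camera frame:
        X_cam = R_cw @ X_world + t_cw
    """
    X_cam = R_cw @ X_world + t_cw
    if X_cam[2] <= 0:
        raise ValueError("point behind camera")
    uv_proj = (K @ (X_cam / X_cam[2]))[:2]
    return uv_obs - uv_proj  # 2-vector to be minimised in the adjustment

# Placeholder numbers, for demonstration only.
K = np.array([[1200.0, 0.0, 640.0],
              [0.0, 1200.0, 480.0],
              [0.0, 0.0, 1.0]])
R_cw = np.eye(3)
t_cw = np.zeros(3)
X_world = np.array([2.0, -1.0, 50.0])   # lidar return in front of the camera
uv_obs = np.array([688.5, 455.2])       # matched pixel from the image
print(reprojection_residual(K, R_cw, t_cw, X_world, uv_obs))
```

In a dynamic-network (factor-graph) setting, such residuals would be stacked alongside the inertial and other optical observations and weighted by the expected correspondence accuracy; the paper's actual observation model may differ in parameterization and weighting.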
Details
1 Earth Sensing & Observation Laboratory (ESO), Ecole Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland