Abstract
With increasingly many smart cameras deployed in infrastructure and commercial buildings, 3D reconstruction can quickly obtain city information and improve the efficiency of government services. Images collected in hazy outdoor environments are prone to color distortion and low contrast; thus, the desired visual effect cannot be achieved and the difficulty of target detection increases. Artificial intelligence (AI) solutions provide great help for dehazing images, as they can automatically identify patterns or monitor the environment. Therefore, we propose a 3D reconstruction method for dehazed images in smart cities based on deep learning. First, we propose a fine transmission image deep convolutional regression network (FT-DCRN) dehazing algorithm that uses the fine transmission image and the atmospheric light value to compute the dehazed image. The DCRN is used to obtain the coarse transmission image, which not only expands the receptive field of the network but also retains the features that maintain the nonlinearity of the overall network. The fine transmission image is obtained by refining the coarse transmission image with a guided filter. The atmospheric light value is estimated according to the position and brightness of the pixels in the original hazy image. Second, we use the dehazed images generated by the FT-DCRN dehazing algorithm for 3D reconstruction. An advanced relaxed iterative fine matching algorithm based on structure from motion (ARI-SFM) is proposed. The ARI-SFM algorithm, which obtains fine-matching corner pairs and reduces the number of iterations, establishes an accurate one-to-one matching relationship between corners. The experimental results show that our FT-DCRN dehazing algorithm improves accuracy compared to other representative algorithms. In addition, the ARI-SFM algorithm maintains precision while improving efficiency.
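The final dehazing step described above — computing the dehazed image from the fine transmission image and the atmospheric light value — is conventionally an inversion of the atmospheric scattering model I(x) = J(x)·t(x) + A·(1 − t(x)). The sketch below illustrates that inversion only; the function name, array shapes, and the transmission lower bound `t_min` are assumptions for illustration, not part of the paper's FT-DCRN implementation.

```python
import numpy as np

def recover_dehazed(hazy, transmission, atmospheric_light, t_min=0.1):
    """Invert the atmospheric scattering model I = J*t + A*(1 - t).

    hazy:              H x W x 3 float image in [0, 1] (the observed hazy image I)
    transmission:      H x W transmission map t (here, the refined/fine map)
    atmospheric_light: length-3 array A, the estimated atmospheric light
    t_min:             lower bound on t to avoid amplifying noise where t -> 0
    """
    # Clip the transmission and broadcast it across the color channels.
    t = np.clip(transmission, t_min, 1.0)[..., np.newaxis]
    # Scene radiance J = (I - A) / t + A.
    return (hazy - atmospheric_light) / t + atmospheric_light
```

As a sanity check, synthesizing a hazy image from a known scene radiance and transmission map and then inverting it recovers the original radiance (when t stays above `t_min`), which is why bounding t away from zero is the standard guard in this recovery step.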