Abstract
Hyperspectral unmixing (HU) is considered one of the most important ways to improve hyperspectral image analysis. HU aims to decompose mixed pixels into a set of spectral signatures, commonly referred to as endmembers, and to determine the fractional abundances of those endmembers. Deep learning (DL) approaches have recently received great attention for HU. In particular, convolutional neural network (CNN)-based methods have performed exceptionally well in such tasks. However, the ability of CNNs to learn deep semantic features is limited, and their computing cost increases dramatically with the number of layers. Transformers address these issues by effectively representing high-level semantic features. In this article, we present a novel approach for HU that utilizes a deep convolutional transformer network. First, a CNN-based autoencoder (AE) is used to extract low-level features from the input image. Second, the concept of a tokenizer is applied for feature transformation. Third, a transformer module is used to capture the deep semantic features derived from the tokenizer. Finally, a convolutional decoder is utilized to reconstruct the input image. Experimental results on synthetic and real datasets demonstrate the effectiveness and superiority of the proposed method compared with other unmixing methods.
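The abstract only names the four stages of the pipeline (CNN encoder, tokenizer, transformer, convolutional decoder). The following PyTorch sketch illustrates how such a pipeline could be wired together; all layer sizes, the tokenizer formulation, and module names are assumptions made for illustration and are not taken from the paper.

```python
# Hypothetical sketch of a convolutional-transformer unmixing autoencoder:
# CNN encoder -> tokenizer -> transformer -> convolutional decoder.
# Sizes, the attention-based tokenizer, and the abundance/decoder heads are
# assumptions for illustration only; the paper's architecture may differ.
import torch
import torch.nn as nn


class ConvTransformerUnmixer(nn.Module):
    def __init__(self, num_bands=200, num_endmembers=4, dim=64, num_tokens=16):
        super().__init__()
        # CNN encoder: extracts low-level spectral-spatial features (assumed sizes).
        self.encoder = nn.Sequential(
            nn.Conv2d(num_bands, 128, kernel_size=3, padding=1),
            nn.BatchNorm2d(128),
            nn.ReLU(),
            nn.Conv2d(128, dim, kernel_size=3, padding=1),
            nn.BatchNorm2d(dim),
            nn.ReLU(),
        )
        # Tokenizer: projects pixel features onto a small set of learned tokens
        # via a softmax attention map (one common tokenizer design; assumed here).
        self.token_attn = nn.Conv2d(dim, num_tokens, kernel_size=1)
        # Transformer encoder: models deep semantic relations among the tokens.
        self.transformer = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True),
            num_layers=2,
        )
        # Abundance head: per-pixel fractional abundances (sum-to-one via softmax).
        self.abundance = nn.Conv2d(dim, num_endmembers, kernel_size=1)
        # Convolutional decoder: 1x1 convolution whose weights act as the
        # endmember signature matrix, reconstructing the input spectra.
        self.decoder = nn.Conv2d(num_endmembers, num_bands, kernel_size=1, bias=False)

    def forward(self, x):                                    # x: (B, bands, H, W)
        feat = self.encoder(x)                               # (B, dim, H, W)
        B, C, H, W = feat.shape
        attn = self.token_attn(feat).flatten(2).softmax(dim=-1)   # (B, T, HW)
        tokens = attn @ feat.flatten(2).transpose(1, 2)           # (B, T, dim)
        tokens = self.transformer(tokens)                         # (B, T, dim)
        # Project token information back onto the pixel grid and fuse with features.
        feat = feat + (attn.transpose(1, 2) @ tokens).transpose(1, 2).reshape(B, C, H, W)
        abund = self.abundance(feat).softmax(dim=1)               # (B, E, H, W)
        recon = self.decoder(abund)                               # (B, bands, H, W)
        return abund, recon


if __name__ == "__main__":
    hsi = torch.rand(1, 200, 32, 32)                         # toy hyperspectral patch
    abundances, reconstruction = ConvTransformerUnmixer()(hsi)
    print(abundances.shape, reconstruction.shape)
```

In this sketch, training would minimize a reconstruction loss between the input and `recon`, so that the abundance maps and the 1x1 decoder weights are learned jointly, mirroring the autoencoder framing described in the abstract.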