Abstract
Intraoperative guidance tools currently used to assist brain tumor resection during surgery have several limitations. Hyperspectral (HS) imaging is emerging as a novel imaging technique that could offer new capabilities to delineate brain tumor tissue during surgery. However, HS acquisition systems have limitations in spatial and spectral resolution that depend on the spectral range to be captured. Image fusion techniques combine information from different sensors to obtain an HS cube with improved spatial and spectral resolution. This paper describes contributions to HS image fusion using two push-broom HS cameras, covering the visible and near-infrared (VNIR) [400–1000 nm] and near-infrared (NIR) [900–1700 nm] spectral ranges, which are integrated into an intraoperative HS acquisition system developed to delineate brain tumor tissue during neurosurgical procedures. Both HS images were registered using intensity-based and feature-based techniques with different geometric transformations to perform the HS image fusion, obtaining an HS cube with a wide spectral range [435–1638 nm]. Four HS datasets were captured to verify the image registration and fusion processes. Moreover, segmentation and classification methods were evaluated to compare the performance of the VNIR and NIR data used independently against the fused data. The results reveal that the proposed VNIR–NIR fusion methodology improves classification accuracy by up to 21% with respect to the use of each data modality independently, depending on the targeted classification problem.
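As a rough illustration of the registration-and-fusion idea described above, the minimal sketch below registers a representative NIR band onto the VNIR reference frame with a feature-based method (ORB keypoints plus a RANSAC-estimated homography in OpenCV) and then stacks the warped NIR bands onto the VNIR bands along the spectral axis. This is only an assumption-laden approximation of the paper's pipeline, not the authors' implementation; the detector, the projective transformation model, and the function names are illustrative choices.

```python
# Hedged sketch: feature-based VNIR/NIR registration and spectral concatenation.
# NOT the authors' exact method; detector, transform model, and names are assumptions.
import numpy as np
import cv2


def register_nir_to_vnir(vnir_band: np.ndarray, nir_band: np.ndarray) -> np.ndarray:
    """Estimate a 3x3 homography mapping the NIR image onto the VNIR grid.

    Both inputs are single-band 8-bit grayscale images (e.g. one representative
    band extracted from each cube).
    """
    orb = cv2.ORB_create(nfeatures=2000)
    kp_v, des_v = orb.detectAndCompute(vnir_band, None)
    kp_n, des_n = orb.detectAndCompute(nir_band, None)

    # Brute-force Hamming matching of binary ORB descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_n, des_v), key=lambda m: m.distance)

    src = np.float32([kp_n[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_v[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, ransacReprojThreshold=5.0)
    return H


def fuse_cubes(vnir_cube: np.ndarray, nir_cube: np.ndarray, H: np.ndarray) -> np.ndarray:
    """Warp every NIR band into the VNIR frame and stack along the spectral axis."""
    h, w = vnir_cube.shape[:2]
    warped = np.stack(
        [cv2.warpPerspective(nir_cube[:, :, b], H, (w, h))
         for b in range(nir_cube.shape[2])],
        axis=2,
    )
    # Fused cube: VNIR bands first, then the registered NIR bands.
    return np.concatenate([vnir_cube, warped], axis=2)
```

In practice the estimated transform would be validated on the captured datasets (as the paper does with four HS datasets) before the fused cube is passed to segmentation and classification.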
Affiliations
1 University of Las Palmas de Gran Canaria, Institute for Applied Microelectronics, Las Palmas de Gran Canaria, Spain (GRID:grid.4521.2) (ISNI:0000 0004 1769 9380)
2 University of Las Palmas de Gran Canaria, Institute for Applied Microelectronics, Las Palmas de Gran Canaria, Spain (GRID:grid.4521.2) (ISNI:0000 0004 1769 9380); Norwegian Institute of Food, Fisheries and Aquaculture Research (Nofima), Tromsø, Norway (GRID:grid.22736.32) (ISNI:0000 0004 0451 2652)
3 University Hospital Doctor Negrin of Gran Canaria, Department of Neurosurgery, Instituto de Investigación Sanitaria de Canarias (IISC), Las Palmas de Gran Canaria, Spain (GRID:grid.22736.32)
4 University Hospital Doctor Negrin of Gran Canaria, Research Unit, Instituto de Investigación Sanitaria de Canarias (IISC), Las Palmas de Gran Canaria, Spain (GRID:grid.22736.32)