Abstract
The ability to form reconstructions beyond the direct line of sight could be transformative in a variety of fields, including search and rescue, autonomous vehicle navigation, and reconnaissance. Most existing active non-line-of-sight (NLOS) imaging methods rely on data collection in which a pulsed laser is directed at several points on a relay surface, one at a time. The prevailing approaches include raster scanning of a rectangular grid on a vertical wall opposite the volume of interest to generate a collection of confocal measurements. These and a recent method that uses a horizontal relay surface are inherently limited by the need for laser scanning. Methods that avoid laser scanning to operate in a snapshot mode are limited to treating the hidden scene of interest as one or two point targets. In this work, based on more complete optical response modeling yet still without multiple illumination positions, we demonstrate accurate reconstructions of foreground objects while also introducing the capability of mapping the stationary scenery behind moving objects. The ability to count, localize, and characterize the sizes of hidden objects, combined with mapping of the stationary hidden scene, could greatly improve indoor situational awareness in a variety of applications.
Most non-line-of-sight imaging requires scanned illumination, limiting its applicability to dynamic scenes. Here the authors exploit occlusion and a sensor array to estimate the locations and sizes of moving foreground objects and to recover a static background map.
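The occlusion-based idea can be illustrated with a toy calculation. The sketch below is not the authors' reconstruction algorithm; it is a minimal Python example under strongly simplifying assumptions (a 1-D hidden background, a hand-made shadow model, and a placeholder radiometric falloff, all invented for illustration). It shows how, even with a single fixed illumination and no laser scanning, the shadows that a hidden foreground object casts onto different sensor pixels can be used to jointly estimate the object's position and the static background reflectivities.

# Toy sketch (not the paper's method): occlusion by a hidden foreground
# object modulates a multi-pixel measurement, enabling joint estimation of
# the object's position and a static background. Geometry and parameters
# below are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)

N_BG = 40    # hidden background patches (unknown reflectivities)
N_PIX = 64   # sensor pixels viewing the relay wall
N_POS = 30   # candidate foreground-object positions to test

def visibility(pos):
    """0/1 mask: which background patches each pixel still sees when a
    foreground object sits at candidate position `pos` (toy shadow model)."""
    mask = np.ones((N_PIX, N_BG))
    for p in range(N_PIX):
        lo = (pos + p // 8) % N_BG      # pixel-dependent shadow location
        mask[p, lo:lo + 4] = 0.0        # object blocks a few patches
    return mask

falloff = 1.0 / (1.0 + np.arange(N_BG))  # placeholder optical response

def forward(pos, bg):
    """Per-pixel measurement: unoccluded background contributions,
    weighted by the toy radiometric falloff."""
    return visibility(pos) @ (falloff * bg)

# Simulate a measurement from an unknown hidden scene.
true_pos = 17
true_bg = rng.uniform(0.2, 1.0, N_BG)
y = forward(true_pos, true_bg) + 0.01 * rng.standard_normal(N_PIX)

# Joint estimation: grid-search the object position; for each candidate,
# fit the background by linear least squares and keep the best residual.
best = None
for pos in range(N_POS):
    A = visibility(pos) * falloff              # linear map from bg to pixels
    bg_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = np.linalg.norm(A @ bg_hat - y)
    if best is None or r < best[0]:
        best = (r, pos, bg_hat)

print("estimated object position:", best[1], "(true:", true_pos, ")")

In this toy model the search over candidate positions plays the role of foreground localization and the per-candidate least-squares fit plays the role of background mapping; the method reported in the paper instead relies on a full time-resolved optical response model rather than this contrived geometry.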
Details
1 Boston University, Electrical and Computer Engineering, Boston, USA (GRID:grid.189504.1) (ISNI:0000 0004 1936 7558); Charles Stark Draper Laboratory, Cambridge, USA (GRID:grid.417533.7) (ISNI:0000 0004 0634 6125)
2 Boston University, Electrical and Computer Engineering, Boston, USA (GRID:grid.189504.1) (ISNI:0000 0004 1936 7558); Universidad Industrial de Santander, Computer Science, Bucaramanga, Colombia (GRID:grid.411595.d) (ISNI:0000 0001 2105 7207)
3 Dip. Elettronica, Informazione e Bioingegneria, Politecnico di Milano, Milano, Italy (GRID:grid.4643.5) (ISNI:0000 0004 1937 0327)
4 Charles Stark Draper Laboratory, Cambridge, USA (GRID:grid.417533.7) (ISNI:0000 0004 0634 6125)
5 Boston University, Electrical and Computer Engineering, Boston, USA (GRID:grid.189504.1) (ISNI:0000 0004 1936 7558)