Content area

Abstract

Medical endoscopic video processing requires real-time execution of color component acquisition, color filter array (CFA) demosaicing, and high-dynamic-range (HDR) compression under low-light conditions, while adhering to strict thermal constraints within the surgical handpiece. Traditional hardware-aware neural architecture search (NAS) relies on fixed hardware design spaces, making it difficult to balance accuracy, power consumption, and real-time performance. We propose a collaborative power–accuracy optimization method for hardware-aware NAS. First, we introduce a hardware modeling framework that abstracts heterogeneous FPGA resources into unified cell units and establishes a power–temperature closed-loop model to ensure that the handpiece surface temperature does not exceed clinical thresholds. Within this framework, we constrain inter-stage latency balance in the pipeline to avoid the routing congestion and frequency degradation caused by overly deep pipelines. We then optimize the NAS search strategy using pipeline blocks combined with a hardware-efficiency reward function. Finally, color component acquisition, CFA demosaicing, dynamic range compression, dynamic-precision quantization, and a streaming architecture are integrated into the framework. Experiments demonstrate that the proposed method achieves 2.8 W power consumption at 47 °C on a Xilinx ZCU102 platform, with a 54% throughput improvement over conventional hardware-aware NAS, providing an engineering-ready lightweight network for medical edge devices such as endoscopes.
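
As a rough illustration of the kind of hardware-efficiency reward and power–temperature coupling described in the abstract, the following Python sketch combines task accuracy with power, latency, and a thermal feasibility check. The function names, the linear thermal-resistance model, and all coefficients are illustrative assumptions for exposition, not the authors' actual formulation.

# Hypothetical sketch (not the paper's code): a hardware-efficiency reward of the
# kind the abstract describes, with a simple power-temperature model used as a
# clinical thermal constraint. All names and numbers below are assumptions.

def surface_temperature(power_w: float, ambient_c: float = 25.0,
                        theta_c_per_w: float = 7.5) -> float:
    """Rough steady-state estimate: T_surface = T_ambient + R_theta * P (linear model, assumed)."""
    return ambient_c + theta_c_per_w * power_w

def hardware_efficiency_reward(accuracy: float, power_w: float, latency_ms: float,
                               temp_limit_c: float = 47.0,
                               latency_budget_ms: float = 33.3,
                               alpha: float = 0.1, beta: float = 0.3) -> float:
    """Reward = accuracy minus power and latency penalties; infeasible candidates are rejected."""
    temp_c = surface_temperature(power_w)
    if temp_c > temp_limit_c or latency_ms > latency_budget_ms:
        return float("-inf")  # violates the thermal limit or the real-time frame budget
    return accuracy - alpha * power_w - beta * (latency_ms / latency_budget_ms)

# Example: a candidate architecture at 92% accuracy, 2.8 W, 20 ms per frame
print(hardware_efficiency_reward(0.92, 2.8, 20.0))

In a NAS loop, a score of this shape would be evaluated per candidate architecture, so that designs exceeding the thermal or latency budget are discarded before accuracy is traded against power.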

Details

Company / organization
Title
Hardware-Aware Neural Architecture Search for Real-Time Video Processing in FPGA-Accelerated Endoscopic Imaging
Author
Zhang, Cunguang 1; Cui, Rui 2; Wang, Gang 3; Gao, Tong 4; Yan, Jielu 1; Xian, Weizhi 1; Wei, Xuekai 1; Qin, Yi 1

1 College of Computer Science, Chongqing University, Chongqing 400044, China; [email protected] (C.Z.); [email protected] (G.W.); [email protected] (T.G.); [email protected] (J.Y.); [email protected] (W.X.); [email protected] (X.W.)
2 East China Institute of Digital Medical Engineering, Shangrao 334000, China
3 School of Computing and Data Engineering, NingboTech University, Ningbo 315100, China
4 School of Instrumentation Science and Opto-Electronics Engineering, Beihang University, Beijing 100083, China
Publication title
Applied Sciences
Volume
15
Issue
20
First page
11200
Number of pages
29
Publication year
2025
Publication date
2025
Publisher
MDPI AG
Place of publication
Basel
Country of publication
Switzerland
Publication subject
e-ISSN
2076-3417
Source type
Scholarly Journal
Language of publication
English
Document type
Journal Article
Publication history
Online publication date
2025-10-19
Milestone dates
2025-09-18 (Received); 2025-10-16 (Accepted)
First posting date
19 Oct 2025
ProQuest document ID
3265831119
Document URL
https://www.proquest.com/scholarly-journals/hardware-aware-neural-architecture-search-real/docview/3265831119/se-2?accountid=208611
Copyright
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
Last updated
2025-10-28
Database
ProQuest One Academic