Content area

Abstract

Time cost is a major challenge in achieving high-quality pluralistic image completion. Recently, the Retentive Network (RetNet), developed for natural language processing, has offered a novel approach to this problem through its low-cost inference capability. Inspired by this, we apply RetNet to the pluralistic image completion task in computer vision. We present RetCompletion, a two-stage framework. In the first stage, we introduce Bi-RetNet, a bidirectional sequence-fusion model that integrates contextual information from images. During inference, we employ a unidirectional pixel-wise update strategy to restore consistent image structures, achieving both high reconstruction quality and fast inference. In the second stage, we use a CNN to upsample the low-resolution result and enhance texture details. Experiments on ImageNet and CelebA-HQ demonstrate that our inference is 10\(\times\) faster than ICT and 15\(\times\) faster than RePaint. The proposed RetCompletion significantly improves inference speed while delivering strong completion quality.
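The speed advantage described above comes from RetNet's recurrent form: at inference each pixel is produced with an O(1) state update instead of re-attending over all previous pixels. The following is a minimal NumPy sketch of that idea, not the paper's implementation; all names (`gamma`, the projection matrices, the raster-scan order) are illustrative assumptions.

```python
import numpy as np

def retention_step(state, q, k, v, gamma):
    """One recurrent retention update: constant cost per pixel at inference."""
    state = gamma * state + np.outer(k, v)  # decay old memory, add new key-value pair
    out = q @ state                         # read the accumulated memory with the query
    return state, out

def unidirectional_update(pixels, Wq, Wk, Wv, gamma=0.9):
    """Generate pixel features one by one (raster-scan order), reusing the state."""
    d = Wq.shape[1]
    state = np.zeros((d, d))
    outputs = []
    for x in pixels:                        # each x is one pixel's feature vector
        q, k, v = x @ Wq, x @ Wk, x @ Wv
        state, o = retention_step(state, q, k, v, gamma)
        outputs.append(o)
    return np.stack(outputs)
```

The recurrent loop is mathematically equivalent to the parallel form \(o_n = q_n \sum_{m \le n} \gamma^{\,n-m}\, k_m^{\top} v_m\), which is what makes training parallelizable while keeping per-pixel inference cheap.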

Details

Title
RetCompletion: High-Speed Inference Image Completion with Retentive Network
Publication title
arXiv.org; Ithaca
Publication year
2024
Publication date
Dec 4, 2024
Section
Computer Science
Publisher
Cornell University Library, arXiv.org
Source
arXiv.org
Place of publication
Ithaca
Country of publication
United States
University/institution
Cornell University Library arXiv.org
e-ISSN
2331-8422
Source type
Working Paper
Language of publication
English
Document type
Working Paper
Publication history
Online publication date
2024-12-05
Milestone dates
2024-10-05 (Submission v1); 2024-12-04 (Submission v2)
First posting date
05 Dec 2024
ProQuest document ID
3141232306
Document URL
https://www.proquest.com/working-papers/retcompletion-high-speed-inference-image/docview/3141232306/se-2?accountid=208611
Copyright
© 2024. This work is published under http://arxiv.org/licenses/nonexclusive-distrib/1.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
Last updated
2024-12-06
Database
ProQuest One Academic