Abstract

The Swin Transformer is an important work among the attempts to reduce the computational complexity of Transformers while maintaining their excellent performance in computer vision. Window-based patch self-attention exploits the local connectivity of image features, while shifted-window patch self-attention enables information to be communicated between patches across the entire image. Through an in-depth study of how different shifted-window sizes affect the efficiency of patch information communication, this article proposes a Dual-Scale Transformer with a double-sized shifted-window attention method. The proposed method surpasses CNN-based methods such as U-Net, AttenU-Net, ResU-Net, and CE-Net by a considerable margin (approximately a 3–6% increase) and outperforms the Transformer-based single-scale Swin Transformer (SwinT) (approximately a 1% increase) on the Kvasir-SEG, ISIC2017, MICCAI EndoVisSub-Instrument, and CadVesSet datasets. The experimental results verify that the proposed dual-scale shifted-window attention benefits the communication of patch information and can improve segmentation results to the state of the art. We also conduct an ablation study on the effect of the shifted-window size on information-flow efficiency and verify that dual-scale shifted-window attention is the optimal network design. Our study highlights the significant impact of network structure design on visual performance, providing valuable insights for the design of Transformer-based networks.
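The mechanism the abstract builds on can be sketched in a few lines: a feature map is partitioned into non-overlapping windows (attention is computed within each window), and a cyclic shift before partitioning lets information cross window boundaries. The sketch below is a minimal, hedged illustration in NumPy; the specific window and shift sizes (and the idea of running two shift scales in parallel) are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def window_partition(x, ws):
    """Split an (H, W, C) feature map into non-overlapping ws x ws windows."""
    H, W, C = x.shape
    x = x.reshape(H // ws, ws, W // ws, ws, C)
    # -> (num_windows, ws*ws, C): self-attention would run within each window
    return x.transpose(0, 2, 1, 3, 4).reshape(-1, ws * ws, C)

def shifted_windows(x, ws, shift):
    """Cyclically shift the map before partitioning (the shifted-window trick),
    so patches near window borders land in a common window."""
    return window_partition(np.roll(x, (-shift, -shift), axis=(0, 1)), ws)

# Toy 8x8 single-channel feature map
x = np.arange(64, dtype=float).reshape(8, 8, 1)

ws = 4
# Two shifted-window branches with different shift sizes -- a plausible
# reading of "dual-scale" (the exact sizes here are assumptions)
branch_a = shifted_windows(x, ws, shift=ws // 2)  # shift of 2
branch_b = shifted_windows(x, ws, shift=ws // 4)  # shift of 1
print(branch_a.shape, branch_b.shape)  # (4, 16, 1) (4, 16, 1)
```

Because the shift is cyclic, no pixels are created or lost; each branch simply groups the same patches into different windows, which is what lets a second shift scale change the pattern of cross-window communication.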

Details

Title
Dual-scale shifted window attention network for medical image segmentation
Author
Han, De-wei 1; Yin, Xiao-lei 2; Xu, Jian 1; Li, Kang 1; Li, Jun-jie 1; Wang, Lu 2; Ma, Zhao-yuan 1

1 Southern University of Science and Technology, School of System Design and Intelligent Manufacturing, Shenzhen, China (GRID:grid.263817.9) (ISNI:0000 0004 1773 1790)
2 Tsinghua University, The Future Laboratory, Haidian District, China (GRID:grid.12527.33) (ISNI:0000 0001 0662 3178)
Pages
17719
Publication year
2024
Publication date
2024
Publisher
Nature Publishing Group
e-ISSN
2045-2322
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
3086483239
Copyright
© The Author(s) 2024. This work is published under http://creativecommons.org/licenses/by-nc-nd/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.