
Swin-ASNet: An Adaptive RGB-selection Network with Swin Transformer for Retinal Vessel Segmentation

Bibliographic Details
Main Authors: Jin, Qunchao, Hou, Hongyu, Zhang, Guixu, Wang, Haoan, Li, Zhi
Format: Conference Proceeding
Language: English
Subjects:
Description
Summary: The retinal vasculature reflected in fundus images provides ophthalmologists with information for diagnosing eye-related diseases, so an accurate, automatic retinal vessel segmentation system is critical. Most deep learning-based methods directly use the color image, or a grayscale image simply transformed from it, as input, and few models focus on the relationship among the three RGB channels. In this paper, we analyze the characteristics of the separated RGB channel images and propose an adaptive RGB-selection network with Swin Transformer (Swin-ASNet). The input of Swin-ASNet comprises the original fundus image and the three grayscale images separated from its color channels. Our method selects useful information adaptively through a designed adaptive selection aggregation module. In addition, we adopt the Swin Transformer as the backbone to extract strong features. To better fuse high- and low-level features, we design a high-low interaction module that applies a modified non-local operation in the graph convolution domain; the low-level features are injected into the deep semantic information to enhance detail representation. Experimental results show that our method achieves state-of-the-art results on three public datasets compared with existing methods.
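The record does not reproduce the paper's architectural details. Purely as an illustration, the sketch below shows one plausible reading of the described input construction (the original fundus image plus three single-channel images separated from it) and a learned, adaptive weighting that aggregates the resulting feature branches. The names (AdaptiveSelectionAggregation, build_inputs), the channel-attention-style weighting, and the convolutional stem standing in for the Swin Transformer backbone are assumptions for the sketch, not the authors' implementation.

    import torch
    import torch.nn as nn

    class AdaptiveSelectionAggregation(nn.Module):
        """Hypothetical sketch: learn per-branch weights to combine feature maps
        from the original RGB image and the three separated channel images."""
        def __init__(self, channels, num_branches=4):
            super().__init__()
            self.num_branches = num_branches
            # Global pooling + linear layer score each branch; softmax normalizes the scores.
            self.score = nn.Sequential(
                nn.AdaptiveAvgPool2d(1),
                nn.Flatten(),
                nn.Linear(channels, num_branches),
            )

        def forward(self, branch_feats):
            # branch_feats: list of num_branches tensors, each (B, C, H, W)
            stacked = torch.stack(branch_feats, dim=1)            # (B, 4, C, H, W)
            summed = stacked.sum(dim=1)                           # (B, C, H, W)
            weights = torch.softmax(self.score(summed), dim=1)    # (B, 4)
            weights = weights.view(-1, self.num_branches, 1, 1, 1)
            return (stacked * weights).sum(dim=1)                 # (B, C, H, W)

    def build_inputs(rgb):
        """Return the original image plus the three single-channel images,
        each repeated to 3 channels so one shared backbone can process them."""
        r, g, b = rgb[:, 0:1], rgb[:, 1:2], rgb[:, 2:3]
        return [rgb] + [c.expand(-1, 3, -1, -1) for c in (r, g, b)]

    if __name__ == "__main__":
        x = torch.rand(2, 3, 224, 224)          # a batch of fundus images
        branches = build_inputs(x)
        stem = nn.Conv2d(3, 64, 3, padding=1)   # stand-in for a Swin Transformer stage
        feats = [stem(b) for b in branches]
        fused = AdaptiveSelectionAggregation(64)(feats)
        print(fused.shape)                      # torch.Size([2, 64, 224, 224])

Under these assumptions, the softmax keeps the aggregation a convex combination of the four branches, which is one common way to realize an adaptive selection over inputs without changing the feature dimensionality.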
ISSN: 1945-788X
DOI: 10.1109/ICME55011.2023.00245