
Incorporating Superpixel Context for Extracting Building From High-Resolution Remote Sensing Imagery

Bibliographic Details
Published in: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 2024-01, Vol. 17, p. 1-15
Main Authors: Fang, Fang; Zheng, Kang; Li, Shengwen; Xu, Rui; Hao, Qingyi; Feng, Yuting; Zhou, Shunping
Format: Article
Language: English
Description
Summary: Extracting buildings from high-resolution (HR) remote sensing imagery (RSI) serves a variety of areas such as smart cities, environmental management, and emergency disaster services. Previous building extraction methods focus primarily on pixel-level and superpixel-level features and do not fully utilize superpixel-level spatial context, leaving room for performance improvement. To bridge this gap, this study incorporates the spatial context of both pixels and superpixels for building extraction from HR RSI. Specifically, the proposed method develops a trainable superpixel segmentation module that segments HR RSI into superpixels by fusing pixel features and pixel-level context, and devises a superpixel-level context aggregation module that incorporates the multi-scale spatial context of superpixels to extract buildings. Experiments on challenging public datasets show that the method outperforms state-of-the-art baselines in accuracy, producing better building boundaries and higher integrity. This study explores a new approach to HR RSI building extraction by introducing the spatial context of superpixels and provides a methodological reference for HR RSI interpretation tasks.
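
The abstract names two components: a trainable (differentiable) superpixel segmentation module and a superpixel-level context aggregation module. The sketch below is a minimal, hypothetical illustration of those two ideas in PyTorch, not the authors' implementation: a soft pixel-to-superpixel assignment in the spirit of differentiable superpixel methods (e.g., SSN), pooling of pixel features into superpixel features, and a single attention step standing in for the paper's multi-scale superpixel context aggregation. All function names, shapes, and the single-scale attention are assumptions for illustration.

```python
# Illustrative sketch only; shapes and names are assumptions, not the paper's code.
import torch
import torch.nn.functional as F

def soft_superpixel_assignment(pixel_feats, centroids, temperature=1.0):
    """pixel_feats: (N, C) features for N pixels; centroids: (K, C) superpixel seeds.
    Returns a soft assignment matrix Q of shape (N, K)."""
    # Negative squared distance between each pixel and each superpixel centroid.
    dists = torch.cdist(pixel_feats, centroids) ** 2           # (N, K)
    return F.softmax(-dists / temperature, dim=1)              # (N, K)

def aggregate_superpixel_features(pixel_feats, Q):
    """Pool pixel features into superpixel features, weighted by the soft assignment."""
    weights = Q / (Q.sum(dim=0, keepdim=True) + 1e-8)          # normalize per superpixel
    return weights.t() @ pixel_feats                           # (K, C)

def superpixel_context(sp_feats):
    """Toy superpixel-level context: self-attention over all superpixels, so each
    superpixel feature is refined by the others (a single-scale stand-in for the
    paper's multi-scale context aggregation module)."""
    attn = F.softmax(sp_feats @ sp_feats.t() / sp_feats.shape[1] ** 0.5, dim=1)
    return sp_feats + attn @ sp_feats

# Tiny usage example, with random features standing in for a CNN backbone's output.
N, K, C = 64 * 64, 100, 32
pixel_feats = torch.randn(N, C)
centroids = pixel_feats[torch.randperm(N)[:K]]                 # seed centroids from pixels
Q = soft_superpixel_assignment(pixel_feats, centroids)
sp_feats = aggregate_superpixel_features(pixel_feats, Q)
sp_feats = superpixel_context(sp_feats)
# Project superpixel-level context back to pixels for per-pixel building prediction.
pixel_context = Q @ sp_feats                                   # (N, C)
```

Keeping the assignment Q soft is what makes the superpixel step trainable end to end; a hard segmentation for visualization can be recovered afterward with Q.argmax(dim=1).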
ISSN: 1939-1404, 2151-1535
DOI: 10.1109/JSTARS.2023.3337140