Scale-Aware Graph Neural Network for Few-Shot Semantic Segmentation
Document Type
Conference Proceeding
Publication Title
Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
Abstract
Few-shot semantic segmentation (FSS) aims to segment objects of unseen classes given only a few densely annotated support images from the same class. Existing FSS methods locate the query object either by using support prototypes or by directly relying on heuristic multi-scale feature fusion. However, they fail to fully leverage the high-order appearance relationships between the multi-scale features of support-query image pairs, leading to inaccurate localization of query objects. To tackle this challenge, we propose an end-to-end scale-aware graph neural network (SAGNN) that reasons over the cross-scale relations among support-query images for FSS. Specifically, a scale-aware graph is first built by taking support-induced multi-scale query features as nodes, while each edge is modeled as the pairwise interaction of its connected nodes. Through progressive message passing over this graph, SAGNN is capable of capturing cross-scale relations and overcoming object variations (e.g., appearance, scale, and location), and can thus learn more precise node embeddings, which in turn enables it to predict the foreground objects more accurately. Moreover, to make full use of the location relations across scales in the query image, a novel self-node collaboration mechanism is proposed to enrich each node, endowing SAGNN with the ability to perceive the same object at different resolutions. Extensive experiments on PASCAL-5^i and COCO-20^i show that SAGNN achieves state-of-the-art results.
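To make the graph construction described in the abstract concrete, below is a minimal, illustrative PyTorch sketch, not the authors' implementation. It assumes a hypothetical build_nodes helper that conditions pooled multi-scale query features on a masked-average support prototype, and a ScaleAwareGraphLayer that performs one round of attention-weighted message passing over a fully connected graph of those scale nodes. The paper's actual SAGNN keeps spatial node features and adds a self-node collaboration mechanism, neither of which is modeled here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ScaleAwareGraphLayer(nn.Module):
    """One round of message passing over a fully connected graph whose nodes
    are support-conditioned multi-scale query features (illustrative only)."""

    def __init__(self, dim):
        super().__init__()
        self.edge_mlp = nn.Linear(2 * dim, 1)    # pairwise edge score from a node pair
        self.node_mlp = nn.Linear(2 * dim, dim)  # fuse a node with its aggregated message

    def forward(self, nodes):
        # nodes: (batch, num_scales, dim)
        b, n, d = nodes.shape
        recv = nodes.unsqueeze(2).expand(b, n, n, d)   # receiver copies (index i)
        send = nodes.unsqueeze(1).expand(b, n, n, d)   # sender copies (index j)
        edge = self.edge_mlp(torch.cat([recv, send], dim=-1)).squeeze(-1)  # (b, n, n)
        attn = F.softmax(edge, dim=-1)                 # normalize senders per receiver
        msg = torch.bmm(attn, nodes)                   # (b, n, d) aggregated messages
        return F.relu(self.node_mlp(torch.cat([nodes, msg], dim=-1)))


def build_nodes(query_feats, support_prototype):
    """Turn multi-scale query feature maps into graph nodes by modulating them
    with a support prototype and pooling each scale (a simplifying assumption)."""
    nodes = []
    for feat in query_feats:                                     # feat: (b, c, h, w)
        modulated = feat * support_prototype[:, :, None, None]   # support-induced feature
        nodes.append(F.adaptive_avg_pool2d(modulated, 1).flatten(1))
    return torch.stack(nodes, dim=1)                             # (b, num_scales, c)


if __name__ == "__main__":
    b, c = 2, 256
    query_feats = [torch.randn(b, c, s, s) for s in (60, 30, 15)]  # three scales
    support_prototype = torch.randn(b, c)                          # masked-pooled support feature
    nodes = build_nodes(query_feats, support_prototype)
    gnn = nn.Sequential(ScaleAwareGraphLayer(c), ScaleAwareGraphLayer(c))
    refined = gnn(nodes)                                           # progressively refined node embeddings
    print(refined.shape)                                           # torch.Size([2, 3, 256])
```

In this sketch, stacking two graph layers stands in for the paper's progressive message passing: each round lets every scale node attend to the others, so coarse and fine views of the query object can exchange evidence before a segmentation head (not shown) would decode the refined embeddings into a mask.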
First Page
5471
Last Page
5480
DOI
10.1109/CVPR46437.2021.00543
Publication Date
11-13-2021
Recommended Citation
G.-S. Xie, J. Liu, H. Xiong, and L. Shao, "Scale-aware graph neural network for few-shot semantic segmentation," in 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2021, pp. 5471-5480, doi: 10.1109/CVPR46437.2021.00543.
Comments
IR deposit conditions: none described
Open access version available through CVF Open Access