SipMask: Spatial information preservation for fast image and video instance segmentation

J. Cao, Tianjin University
R.M. Anwer, Mohamed bin Zayed University of Artificial Intelligence
H. Cholakkal, Mohamed bin Zayed University of Artificial Intelligence & Inception Institute of Artificial Intelligence
F.S. Khan, Mohamed bin Zayed University of Artificial Intelligence & Inception Institute of Artificial Intelligence
Y. Pang, Tianjin University
L. Shao, Mohamed bin Zayed University of Artificial Intelligence & Inception Institute of Artificial Intelligence

Abstract

Single-stage instance segmentation approaches have recently gained popularity due to their speed and simplicity, but they still lag behind two-stage methods in accuracy. We propose a fast single-stage instance segmentation method, called SipMask, that preserves instance-specific spatial information by separating the mask prediction of an instance into different sub-regions of a detected bounding-box. Our main contribution is a novel light-weight spatial preservation (SP) module that generates a separate set of spatial coefficients for each sub-region within a bounding-box, leading to improved mask predictions. It also enables accurate delineation of spatially adjacent instances. Further, we introduce a mask alignment weighting loss and a feature alignment scheme to better correlate mask prediction with object detection. On COCO test-dev, our SipMask outperforms the existing single-stage methods. Compared to the state-of-the-art single-stage TensorMask, SipMask obtains an absolute gain of 1.0% (mask AP), while providing a four-fold speedup. In terms of real-time capabilities, SipMask outperforms YOLACT with an absolute gain of 3.0% (mask AP) under similar settings, while operating at comparable speed on a Titan Xp. We also evaluate SipMask for real-time video instance segmentation, achieving promising results on the YouTube-VIS dataset. The source code is available at https://github.com/JialeCao001/SipMask.
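To make the core idea concrete, the following is a minimal, illustrative sketch of per-sub-region mask assembly, assuming a YOLACT-style design in which shared basis maps are linearly combined with per-instance coefficients. Here, following the abstract, a separate coefficient vector is applied to each cell of a 2x2 split of the detected box. The function name, the 2x2 split, and the tensor shapes are assumptions for illustration, not the authors' exact implementation.

```python
import torch


def subregion_mask(basis_maps, coeffs_per_subregion, box):
    """Illustrative sketch: combine shared basis maps with a separate
    coefficient vector for each 2x2 sub-region of a detected box.

    basis_maps:           (C, H, W) basis/prototype maps from the mask branch
    coeffs_per_subregion: (4, C) one coefficient vector per sub-region
                          (top-left, top-right, bottom-left, bottom-right)
    box:                  (x1, y1, x2, y2) bounding box in basis-map coordinates
    """
    C, H, W = basis_maps.shape
    x1, y1, x2, y2 = [int(v) for v in box]
    xm, ym = (x1 + x2) // 2, (y1 + y2) // 2  # split the box into a 2x2 grid

    mask = torch.zeros(H, W)
    sub_boxes = [
        (x1, y1, xm, ym),  # top-left
        (xm, y1, x2, ym),  # top-right
        (x1, ym, xm, y2),  # bottom-left
        (xm, ym, x2, y2),  # bottom-right
    ]
    for (sx1, sy1, sx2, sy2), c in zip(sub_boxes, coeffs_per_subregion):
        region = basis_maps[:, sy1:sy2, sx1:sx2]  # (C, h, w) crop of the bases
        # linear combination of basis maps using this sub-region's coefficients
        mask[sy1:sy2, sx1:sx2] = torch.einsum("c,chw->hw", c, region)
    return mask.sigmoid()
```

In this sketch the only difference from a single-coefficient design is that each sub-region of the box gets its own linear combination, which is what allows the predicted mask to retain spatial detail within the box.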