Self-Supervised Masked Convolutional Transformer Block for Anomaly Detection
Document Type
Article
Publication Title
IEEE Transactions on Pattern Analysis and Machine Intelligence
Abstract
Anomaly detection has recently gained increasing attention in the field of computer vision, likely due to its broad set of applications ranging from product fault detection on industrial production lines and impending event detection in video surveillance to finding lesions in medical scans. Regardless of the domain, anomaly detection is typically framed as a one-class classification task, where the learning is conducted on normal examples only. An entire family of successful anomaly detection methods is based on learning to reconstruct masked normal inputs (e.g. patches, future frames, etc.) and using the magnitude of the reconstruction error as an indicator of the abnormality level. Unlike other reconstruction-based methods, we present a novel self-supervised masked convolutional transformer block (SSMCTB) that comprises the reconstruction-based functionality at a core architectural level. The proposed self-supervised block is extremely flexible, enabling information masking at any layer of a neural network and being compatible with a wide range of neural architectures. In this work, we extend our previous self-supervised predictive convolutional attentive block (SSPCAB) with a 3D masked convolutional layer, a transformer for channel-wise attention, as well as a novel self-supervised objective based on Huber loss. Furthermore, we show that our block is applicable to a wider variety of tasks, adding anomaly detection in medical images and thermal videos to the previously considered tasks based on RGB images and surveillance videos. We exhibit the generality and flexibility of SSMCTB by integrating it into multiple state-of-the-art neural models for anomaly detection, bringing forth empirical results that confirm considerable performance improvements on five benchmarks: MVTec AD, BraTS, Avenue, ShanghaiTech, and Thermal Rare Event.
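The core idea described in the abstract — masking part of the receptive field so a convolutional layer must reconstruct the hidden region from its context, and penalizing the reconstruction with a Huber loss — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation (the paper's block is 3D, learnable, and includes transformer-based channel attention); the kernel, padding mode, and function names below are illustrative assumptions only.

```python
import numpy as np

def huber_loss(pred, target, delta=1.0):
    """Huber (smooth L1) loss: quadratic for small errors, linear for large ones.

    The paper uses a Huber-based self-supervised objective; delta=1.0 is an
    assumed default, not a value taken from the paper.
    """
    err = pred - target
    abs_err = np.abs(err)
    quadratic = 0.5 * err ** 2
    linear = delta * (abs_err - 0.5 * delta)
    return float(np.mean(np.where(abs_err <= delta, quadratic, linear)))

def masked_conv2d(x, kernel):
    """2D convolution with the kernel's center zeroed out.

    Zeroing the center weight hides each pixel from its own reconstruction,
    so the layer must predict it from the surrounding context -- the masked
    reconstruction principle the abstract describes (here 2D for simplicity,
    whereas SSMCTB uses a 3D masked convolution).
    """
    k = kernel.copy().astype(float)
    kh, kw = k.shape
    k[kh // 2, kw // 2] = 0.0          # mask the center of the receptive field
    pad = kh // 2
    xp = np.pad(x.astype(float), pad, mode="edge")
    out = np.zeros_like(x, dtype=float)
    h, w = x.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * k)
    return out

# Usage: reconstruct a normal input and score it by reconstruction error.
x = np.random.rand(8, 8)
kernel = np.full((3, 3), 1.0 / 8.0)    # average of the 8 masked-out neighbors
recon = masked_conv2d(x, kernel)
anomaly_score = huber_loss(recon, x)   # higher error -> more anomalous
```

In the actual block, the kernel weights are learned end-to-end with the host network, and the self-supervised reconstruction loss is added to the host model's objective; the fixed averaging kernel above merely stands in for those learned weights.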
First Page
525
Last Page
542
DOI
10.1109/TPAMI.2023.3322604
Publication Date
1-1-2024
Keywords
Abnormal event detection, anomaly detection, attention mechanism, masked convolution, self-attention, self-supervised learning, transformer
Recommended Citation
N. Madan et al., "Self-Supervised Masked Convolutional Transformer Block for Anomaly Detection," in IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 46, no. 1, pp. 525-542, Jan. 2024, doi: 10.1109/TPAMI.2023.3322604.
Additional Links
https://doi.org/10.1109/TPAMI.2023.3322604
Comments
IR Deposit conditions:
OA version (pathway a): Accepted version
No embargo
When accepted for publication, set statement to accompany deposit (see policy)
Must link to publisher version with DOI
Publisher copyright and source must be acknowledged