A Coarse-to-Fine Facial Landmark Detection Method Based on Self-attention Mechanism
Document Type
Article
Publication Title
IEEE Transactions on Multimedia
Abstract
Facial landmark detection in the wild remains a challenging problem in computer vision. Deep learning-based methods currently play a leading role in addressing this problem. However, these approaches generally focus on local feature learning and ignore global relationships. Therefore, in this study, a self-attention mechanism is introduced into facial landmark detection. Specifically, a coarse-to-fine facial landmark detection method is proposed that uses two stacked hourglasses as the backbone, with a new landmark-guided self-attention (LGSA) block inserted between them. The LGSA block learns the global relationships between different positions on the feature map and allows feature learning to focus on the locations of landmarks with the help of a landmark-specific attention map, which is generated by the first-stage hourglass model. A novel attentional consistency loss is also proposed to ensure that an accurate landmark-specific attention map is generated. A new channel transformation block is used as the building block of the hourglass model to improve the model's capacity. A coarse-to-fine strategy is adopted within and between stages to reduce complexity. Extensive experimental results on public datasets demonstrate the superiority of the proposed method over state-of-the-art models.
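To make the LGSA idea described in the abstract more concrete, the following is a minimal, hypothetical sketch: a non-local style self-attention block over feature maps whose aggregated context is gated by a landmark-specific attention map, plus one plausible form of an attentional consistency loss. The class name `LGSABlock`, the channel sizes, and the MSE-based loss are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only (not the paper's code): self-attention over spatial
# positions of a feature map, gated by a landmark-specific attention map.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LGSABlock(nn.Module):
    """Non-local style self-attention gated by a landmark attention map."""

    def __init__(self, channels: int, reduced: int = 64):
        super().__init__()
        self.query = nn.Conv2d(channels, reduced, kernel_size=1)
        self.key = nn.Conv2d(channels, reduced, kernel_size=1)
        self.value = nn.Conv2d(channels, channels, kernel_size=1)
        self.out = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, feats: torch.Tensor, landmark_map: torch.Tensor) -> torch.Tensor:
        # feats: (B, C, H, W) features from the first-stage hourglass
        # landmark_map: (B, 1, H, W) landmark-specific attention map in [0, 1]
        b, c, h, w = feats.shape
        q = self.query(feats).flatten(2).transpose(1, 2)   # (B, HW, C')
        k = self.key(feats).flatten(2)                      # (B, C', HW)
        v = self.value(feats).flatten(2).transpose(1, 2)    # (B, HW, C)

        # Global relationships between every pair of spatial positions.
        attn = torch.softmax(q @ k / (q.shape[-1] ** 0.5), dim=-1)  # (B, HW, HW)
        ctx = (attn @ v).transpose(1, 2).reshape(b, c, h, w)

        # Gate the aggregated context so feature learning focuses on landmarks.
        ctx = self.out(ctx) * landmark_map
        return feats + ctx


def attentional_consistency_loss(pred_map: torch.Tensor,
                                 target_map: torch.Tensor) -> torch.Tensor:
    # One plausible realization: penalize disagreement between the predicted
    # attention map and a map rendered from ground-truth landmarks
    # (the paper's exact formulation may differ).
    return F.mse_loss(pred_map, target_map)


if __name__ == "__main__":
    block = LGSABlock(channels=128)
    feats = torch.randn(2, 128, 32, 32)
    lmk_map = torch.rand(2, 1, 32, 32)
    print(block(feats, lmk_map).shape)  # torch.Size([2, 128, 32, 32])
```

In this sketch the residual connection and the multiplicative gating are design choices made for clarity; they show how a landmark map produced by an earlier stage can steer global attention toward landmark regions before features are passed to the second hourglass.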
First Page
926
Last Page
938
DOI
10.1109/TMM.2020.2991507
Publication Date
4-30-2020
Keywords
Convolutional neural network, facial landmark detection, self-attention mechanism
Recommended Citation
P. Gao, K. Lu, J. Xue, L. Shao and J. Lyu, "A Coarse-to-Fine Facial Landmark Detection Method Based on Self-attention Mechanism," in IEEE Transactions on Multimedia, vol. 23, pp. 926-938, 2021, doi: 10.1109/TMM.2020.2991507.
Additional Links
DOI link: https://doi.org/10.1109/TMM.2020.2991507
Comments
IR Deposit conditions:
OA version (pathway a): Accepted version
No embargo
When accepted for publication, a set statement must accompany the deposit (see policy)
Must link to publisher version with DOI
Publisher copyright and source must be acknowledged