Document Type
Article
Publication Title
arXiv
Abstract
In this paper, we introduce variational semantic memory into meta-learning to acquire long-term knowledge for few-shot learning. The variational semantic memory accrues and stores semantic information for the probabilistic inference of class prototypes in a hierarchical Bayesian framework. The semantic memory is grown from scratch and gradually consolidated by absorbing information from the tasks it experiences. By doing so, it accumulates long-term, general knowledge that enables the learning of new object concepts. We formulate memory recall as the variational inference of a latent memory variable from addressed contents, which offers a principled way to adapt the knowledge to individual tasks. Our variational semantic memory, as a new long-term memory module, confers principled recall and update mechanisms that enable semantic information to be efficiently accrued and adapted for few-shot learning. Experiments demonstrate that the probabilistic modelling of prototypes yields a more informative representation of object classes than deterministic vectors. The consistent new state-of-the-art performance on four benchmarks shows the benefit of variational semantic memory in boosting few-shot recognition.
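
To make the central mechanism concrete, below is a minimal, hypothetical PyTorch sketch of a prototype treated as a latent Gaussian variable, inferred from support features together with content recalled from a semantic memory, using a reparameterized sample and a KL term against a memory-conditioned prior. All module names, the single-linear-layer inference and prior networks, and the shapes are illustrative assumptions, not the authors' implementation.

    # Illustrative sketch only: a stochastic class prototype inferred from
    # support features plus content addressed from a semantic memory.
    import torch
    import torch.nn as nn
    from torch.distributions import Normal, kl_divergence

    class VariationalPrototype(nn.Module):
        def __init__(self, feat_dim: int):
            super().__init__()
            # Amortized posterior: maps [support mean ; recalled memory]
            # to the mean and log-variance of the latent prototype z.
            self.posterior_net = nn.Linear(2 * feat_dim, 2 * feat_dim)
            # Prior conditioned on the recalled memory content alone.
            self.prior_net = nn.Linear(feat_dim, 2 * feat_dim)

        def forward(self, support_feats: torch.Tensor, memory_content: torch.Tensor):
            # support_feats: (n_shot, feat_dim) embeddings of one class's support set
            # memory_content: (feat_dim,) content addressed from the semantic memory
            support_mean = support_feats.mean(dim=0)
            mu_q, logvar_q = self.posterior_net(
                torch.cat([support_mean, memory_content])).chunk(2)
            mu_p, logvar_p = self.prior_net(memory_content).chunk(2)
            q = Normal(mu_q, torch.exp(0.5 * logvar_q))
            p = Normal(mu_p, torch.exp(0.5 * logvar_p))
            z = q.rsample()                  # reparameterized stochastic prototype
            kl = kl_divergence(q, p).sum()   # regularizes recall toward the memory prior
            return z, kl

    # Toy usage with random stand-ins for a 5-shot class and recalled memory:
    proto_net = VariationalPrototype(feat_dim=64)
    support = torch.randn(5, 64)
    recalled = torch.randn(64)
    z, kl = proto_net(support, recalled)

In a full episode, the sampled prototype z would score query embeddings (e.g. by negative distance) and the KL term would be added to the classification loss; those steps are omitted here for brevity.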
DOI
10.48550/arXiv.2010.10341
Publication Date
July 15, 2021
Recommended Citation
X. Zhen, Y. Du, H. Xiong, Q. Qiu, C.G.M. Snoek, and L. Shao, "Learning to learn variational semantic memory," 2021, arXiv:2010.10341.
Comments
Preprint: arXiv