Document Type
Conference Proceeding
Publication Title
37th International Conference on Machine Learning, ICML 2020
Abstract
We introduce kernels with random Fourier features in the meta-learning framework for few-shot learning. We propose meta variational random features (MetaVRF) to learn adaptive kernels for the base-learner, which is developed in a latent variable model by treating the random feature basis as the latent variable. We formulate the optimization of MetaVRF as a variational inference problem by deriving an evidence lower bound under the meta-learning framework. To incorporate shared knowledge from related tasks, we propose a context inference of the posterior, which is established by an LSTM architecture. The LSTM-based inference network effectively integrates the context information of previous tasks with task-specific information, generating informative and adaptive features. The learned MetaVRF is able to produce kernels of high representational power with a relatively low spectral sampling rate and also enables fast adaptation to new tasks. Experimental results on a variety of few-shot regression and classification tasks demonstrate that MetaVRF can deliver much better, or at least competitive, performance compared to existing meta-learning alternatives.
First Page
11346
Last Page
11356
Publication Date
7-2020
Keywords
Machine learning, Adaptive features, Classification tasks, Context information, Inference network, Latent variable modeling, Meta-learning frameworks, Spectral sampling, Variational inference
Recommended Citation
X. Zhen et al., "Learning to learn kernels with variational random features," 37th International Conference on Machine Learning, ICML 2020, vol. PartF168147-15, pp. 11346–11356, Jul. 2020.
Comments
Archived thanks to ACM
Open Access
Uploaded 31 January 2024
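
Illustrative Example
The abstract above builds on the standard random Fourier feature approximation of a shift-invariant kernel, where sampled Fourier bases replace an explicit kernel computation. The sketch below is a minimal, self-contained illustration of that building block only, not the authors' MetaVRF implementation; the bandwidth sigma, feature count D, and all variable names are illustrative assumptions.

```python
import numpy as np

def random_fourier_features(X, omega, b):
    """Map inputs X of shape (n, d) to D-dimensional random Fourier features."""
    D = omega.shape[1]
    return np.sqrt(2.0 / D) * np.cos(X @ omega + b)

# Bases for a Gaussian (RBF) kernel: omega ~ N(0, I / sigma^2), b ~ Uniform[0, 2*pi]
rng = np.random.default_rng(0)
d, D, sigma = 5, 2048, 1.0
omega = rng.normal(scale=1.0 / sigma, size=(d, D))
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

X = rng.normal(size=(10, d))
Y = rng.normal(size=(8, d))

# Approximate kernel as an inner product of the random features
K_approx = random_fourier_features(X, omega, b) @ random_fourier_features(Y, omega, b).T

# Exact RBF kernel for comparison
sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
K_exact = np.exp(-sq_dists / (2.0 * sigma ** 2))
print(np.max(np.abs(K_approx - K_exact)))  # approximation error shrinks as D grows
```

In MetaVRF, as described in the abstract, the random bases are not drawn from a fixed prior as in this sketch; they are treated as a latent variable whose posterior is inferred per task by an LSTM-based context inference network, which is what makes the resulting kernels adaptive.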