Learning to Learn Kernels with Variational Random Features
Xiantong Zhen*, Haoliang Sun*, Yingjun Du*, Jun Xu, Yilong Yin, Ling Shao, Cees Snoek
Presenter: Haoliang Sun
ICML | 2020
Meta-Learning (Learning to Learn)

[Figure: tasks t1, t2, t3 with their datasets D1, D2, D3 train base learners 1-3 through a shared meta learner; a new task t' with dataset D' is handled by a new learner.]
[Figure: example of the few-shot learning setup (Ravi et al., 2017), shown over Episode 1 and Episode 2.]
The base learner for task t is kernel ridge regression. Given the support set S_t and query set Q_t, minimizing the regularized squared loss over the support set gives the closed-form solution

  α = (λI + K)⁻¹ y,

where K is the kernel matrix on the support set, K_ij = k(x_i, x_j) = φ(x_i)ᵀφ(x_j) for a mapping function φ. The predictor on a query point x is

  ŷ = k_xᵀ α,  with  k_x = [k(x, x_1), …, k(x, x_n)]ᵀ.
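This closed-form base learner can be sketched in a few lines of NumPy; the RBF kernel and the hyperparameter values here are illustrative choices, not the paper's:

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gram matrix of the RBF kernel k(a, b) = exp(-gamma * ||a - b||^2).
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_krr(X_s, y_s, lam=0.1, gamma=1.0):
    # Closed-form kernel ridge regression on the support set:
    # alpha = (lam * I + K)^{-1} y.
    K = rbf_kernel(X_s, X_s, gamma)
    return np.linalg.solve(lam * np.eye(len(X_s)) + K, y_s)

def predict_krr(X_q, X_s, alpha, gamma=1.0):
    # Predictor on query points: y_hat = k_x^T alpha.
    return rbf_kernel(X_q, X_s, gamma) @ alpha

# Example: 5-shot sine regression for one task (illustrative data).
X_s = np.array([[-2.0], [-1.0], [0.0], [1.0], [2.0]])
y_s = np.sin(X_s[:, 0])
alpha = fit_krr(X_s, y_s, lam=0.01)
y_q = predict_krr(np.array([[0.5]]), X_s, alpha)
```

The solve against (λI + K) is the entire adaptation step for a task, which is why the kernel itself, not an inner-loop optimizer, carries the adaptation.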
- Learn adaptive kernels in a data-driven way.
- Leverage shared knowledge by exploring dependencies among related tasks to generate rich features.
- Construct approximately translation-invariant kernels using explicit feature maps via random bases (Bochner's theorem).
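The last point can be sketched with standard random Fourier features: for an RBF kernel, Bochner's theorem says the kernel is the Fourier transform of a Gaussian spectral density, so sampling bases from that density gives an explicit cosine feature map whose inner products approximate the kernel. This is the generic construction, not the paper's learned bases:

```python
import numpy as np

def random_fourier_features(X, omegas, b):
    # Explicit feature map z(x) = sqrt(2/D) * cos(omega_i^T x + b_i):
    # by Bochner's theorem, z(x)^T z(x') approximates the kernel.
    D = omegas.shape[0]
    return np.sqrt(2.0 / D) * np.cos(X @ omegas.T + b)

rng = np.random.default_rng(0)
d, D, sigma = 2, 2000, 1.0
# For the RBF kernel exp(-||x - x'||^2 / (2 sigma^2)),
# the spectral density is Gaussian: omega ~ N(0, sigma^{-2} I).
omegas = rng.normal(0.0, 1.0 / sigma, size=(D, d))
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

X = rng.normal(size=(5, d))
Z = random_fourier_features(X, omegas, b)
K_approx = Z @ Z.T
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_exact = np.exp(-d2 / (2.0 * sigma ** 2))
```

The approximation error shrinks as O(1/√D); the paper's idea is to infer the bases `omegas` per task rather than draw them from a fixed Gaussian.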
- The posterior over the random bases is intractable; approximate it with a meta variational distribution q(ω | S).
- The Evidence Lower Bound (ELBO):

  log p(y | x, S) ≥ E_{q(ω|S)}[log p(y | x, ω, S)] − KL(q(ω | S) ‖ p(ω))

- The objective: maximize the ELBO jointly over tasks.
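The ELBO above can be estimated by Monte Carlo. A minimal sketch, assuming a diagonal-Gaussian variational distribution and a standard-normal prior; `log_lik_fn` is a hypothetical placeholder for the task log-likelihood log p(y | x, ω, S), e.g., computed from the kernel ridge predictor:

```python
import numpy as np

def gaussian_kl(mu, logvar):
    # Closed-form KL( N(mu, diag(exp(logvar))) || N(0, I) ).
    return 0.5 * np.sum(np.exp(logvar) + mu ** 2 - 1.0 - logvar)

def elbo(mu, logvar, log_lik_fn, n_samples=8, rng=None):
    # Monte Carlo ELBO: E_q[log-likelihood] - KL(q || p), using the
    # reparameterization omega = mu + std * eps, eps ~ N(0, I).
    if rng is None:
        rng = np.random.default_rng(0)
    std = np.exp(0.5 * logvar)
    ll = 0.0
    for _ in range(n_samples):
        omega = mu + std * rng.normal(size=mu.shape)
        ll += log_lik_fn(omega)
    return ll / n_samples - gaussian_kl(mu, logvar)
```

The reparameterization keeps the sample differentiable in (mu, logvar), so the same estimate can be backpropagated when training the inference network.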
- Generate rich random bases to build strong kernels.
- Condition the inference of the bases on the context of related tasks.

[Figure: the directed graphical model.]
- An LSTM transformation takes the support set and the previous cell states as input, carrying context across related tasks.
- Shared MLPs infer the parameters of the variational distribution from the LSTM output.
- The optimization objective incorporates this context inference.
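A minimal NumPy sketch of such an inference network: a single LSTM cell consumes a pooled support-set embedding together with the previous hidden and cell states (the cross-task context), and two linear heads (standing in for the shared MLPs) output the mean and log-variance of the variational distribution. All names, sizes, and initializations here are illustrative, not the paper's:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class ContextLSTMInference:
    def __init__(self, in_dim, hid_dim, omega_dim, seed=0):
        rng = np.random.default_rng(seed)
        z = in_dim + hid_dim
        # One weight matrix for the four LSTM gates: input, forget, cell, output.
        self.W = rng.normal(0.0, 0.1, (4 * hid_dim, z))
        self.b = np.zeros(4 * hid_dim)
        # Linear heads mapping the hidden state to q's parameters.
        self.W_mu = rng.normal(0.0, 0.1, (omega_dim, hid_dim))
        self.W_lv = rng.normal(0.0, 0.1, (omega_dim, hid_dim))
        self.hid_dim = hid_dim

    def step(self, support_emb, h_prev, c_prev):
        z = np.concatenate([support_emb, h_prev])
        gates = self.W @ z + self.b
        H = self.hid_dim
        i = sigmoid(gates[:H]); f = sigmoid(gates[H:2*H])
        g = np.tanh(gates[2*H:3*H]); o = sigmoid(gates[3*H:])
        c = f * c_prev + i * g     # cell state carries context across tasks
        h = o * np.tanh(c)
        mu, logvar = self.W_mu @ h, self.W_lv @ h
        return mu, logvar, h, c
```

Iterating `step` over a sequence of tasks lets each task's q(ω) depend on the accumulated context of the tasks seen before it.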
Few-shot regression results (four compared methods per row, left to right):

Setting   Results
3-shot    1.913   1.072   0.722   0.700
5-shot    0.415   0.063   0.047   0.022
10-shot   0.294   0.024   0.009   0.003