Content provided by Yinbo Chen, the first author of the paper A New Meta-Baseline for Few-Shot Learning.
Meta-learning has become a popular framework for few-shot learning in recent years. While more and more novel meta-learning models are being proposed, our research uncovers a simple baseline that has been overlooked. We further analyse the potential reasons behind this simple method's success: we observe an objective discrepancy in the meta-learning stage, and we find that both pre-training and inheriting a good few-shot classification metric from the pre-trained classifier are important for Meta-Baseline, potentially helping the model better utilize pre-trained representations with stronger transferability. Our work sets up a new solid benchmark for the field and sheds light on further understanding the phenomena of the meta-learning framework for few-shot learning.
What’s New: A simple new baseline for few-shot learning that achieves state-of-the-art performance, together with an analysis of base class generalization.
How It Works: Meta-Baseline first pre-trains a classifier on all base classes, then meta-learns with a nearest-centroid based few-shot classification algorithm. Despite its simplicity, it outperforms recent state-of-the-art methods by a large margin.
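The nearest-centroid step above can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: it assumes embeddings have already been produced by the pre-trained encoder, and uses cosine similarity to centroids (the function name and array shapes are our own choices for the example).

```python
import numpy as np

def nearest_centroid_predict(support, support_labels, query, n_way):
    """Classify query embeddings by cosine similarity to class centroids.

    support: (n_way * k_shot, d) support-set embeddings from the encoder
    support_labels: (n_way * k_shot,) integer labels in [0, n_way)
    query: (n_query, d) query embeddings
    Returns an array of predicted labels, one per query embedding.
    """
    # Centroid of each class = mean of its support embeddings.
    centroids = np.stack([support[support_labels == c].mean(axis=0)
                          for c in range(n_way)])
    # Cosine similarity: L2-normalize both sides, then take dot products.
    q = query / np.linalg.norm(query, axis=1, keepdims=True)
    c = centroids / np.linalg.norm(centroids, axis=1, keepdims=True)
    sims = q @ c.T                      # shape (n_query, n_way)
    return sims.argmax(axis=1)

# Toy 2-way, 2-shot episode in 2-D: class 0 clusters near [1, 0],
# class 1 near [0, 1]; the two queries sit on those axes.
support = np.array([[1.0, 0.1], [0.9, 0.0], [0.0, 1.0], [0.1, 0.9]])
labels = np.array([0, 0, 1, 1])
query = np.array([[1.0, 0.0], [0.0, 1.0]])
print(nearest_centroid_predict(support, labels, query, n_way=2))  # [0 1]
```

During meta-learning, the similarity scores (typically scaled by a temperature) feed a softmax cross-entropy loss, so gradients flow back into the encoder; the snippet above shows only the inference-time classification rule.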
Key Insights: A simple method (Meta-Baseline) has been overlooked; resolving the discrepancy between base class generalization and novel class generalization is potentially a key challenge in few-shot learning. The paper presents a surprisingly simple method that outperforms recent state-of-the-art methods in few-shot learning, and rethinks potential future directions for the field.
The paper A New Meta-Baseline for Few-Shot Learning is on arXiv.
Share My Research is Synced’s new column that welcomes scholars to share their own research breakthroughs with over 1.5M global AI enthusiasts. Beyond technological advances, Share My Research also calls for interesting stories behind the research and exciting research ideas. Share your research with us.