Deep neural network models with strong feature extraction capacity are prone to overfitting and fail to adapt quickly to new tasks with few samples. Gradient-based meta-learning approaches can mitigate overfitting and adapt to new tasks quickly, but they typically rely on shallow neural networks with limited feature extraction capacity. In this paper we present a simple and effective approach called Meta-Transfer-Adjustment learning (MTA), which enables deep neural networks with powerful feature extraction capabilities to be applied to few-shot scenarios while avoiding overfitting, and which gains the capacity to adapt quickly to new tasks by training on numerous tasks. The proposed approach comprises two major parts: the Feature Adjustment (FA) module and the Task Adjustment (TA) module. The FA module helps the model make better use of the deep network to improve feature extraction, while the TA module is used to further improve the model's fast-adaptation and generalization capabilities. Experiments show that the proposed model delivers good classification results on the benchmark few-shot datasets MiniImageNet and Fewshot-CIFAR100.
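To make the gradient-based meta-learning setting mentioned above concrete, the following is a minimal, hedged sketch of a first-order MAML-style meta-update (a standard gradient-based meta-learning baseline, not the authors' MTA method): an inner loop adapts parameters on each task's support set, and the outer loop updates the shared initialization from the query-set gradients. All names (`fomaml_step`, `sample_task`) and the toy 1-D regression tasks are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def loss_grad(w, x, y):
    """Gradient of the mean squared error 0.5*(w*x - y)^2 with respect to w."""
    return np.mean((w * x - y) * x)

def fomaml_step(w, tasks, inner_lr=0.05, meta_lr=0.1, inner_steps=1):
    """One first-order MAML meta-update over a batch of tasks.

    Each task is (x_support, y_support, x_query, y_query). The inner loop
    adapts w on the support set; the meta-gradient is the query-set
    gradient evaluated at the adapted parameters (first-order approximation).
    """
    meta_grad = 0.0
    for xs, ys, xq, yq in tasks:
        w_adapt = w
        for _ in range(inner_steps):  # task-specific adaptation
            w_adapt -= inner_lr * loss_grad(w_adapt, xs, ys)
        meta_grad += loss_grad(w_adapt, xq, yq)  # evaluate on query set
    return w - meta_lr * meta_grad / len(tasks)

def sample_task(n=5):
    """Toy few-shot regression task y = a*x with a random slope a."""
    a = rng.uniform(-2.0, 2.0)
    xs, xq = rng.normal(size=n), rng.normal(size=n)
    return xs, a * xs, xq, a * xq

# Meta-train the shared initialization over many sampled tasks.
w = 0.0
for _ in range(200):
    w = fomaml_step(w, [sample_task() for _ in range(4)])
```

After meta-training, a single inner gradient step on a new task's support set should already reduce that task's loss, which is the "fast adaptation" property the abstract refers to.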