International Conference on Computational Linguistics

Effective Few-Shot Classification with Transfer Learning



Abstract

Few-shot learning addresses the problem of learning based on a small amount of training data. Although better studied in the domain of computer vision, recent work has adapted the Amazon Review Sentiment Classification (ARSC) text dataset for use in the few-shot setting. In this work, we use the ARSC dataset to study a simple application of transfer learning approaches to few-shot classification. We train a single binary classifier to learn all few-shot classes jointly by prefixing class identifiers to the input text. Given the text and class, the model then makes a binary prediction for that text/class pair. Our results show that this simple approach can outperform most published results on this dataset. Surprisingly, we also show that including domain information as part of the task definition leads to only a modest improvement in model accuracy, and that zero-shot classification, without further fine-tuning on the few-shot domains, performs equivalently to few-shot classification. These results suggest that the classes in the ARSC few-shot task, which are defined by the intersection of domain and rating, are actually very similar to each other, and that a more suitable dataset is needed for the study of few-shot text classification.
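The abstract only sketches the input format, so a minimal illustration may help. The snippet below shows one plausible way to prefix a class identifier to the input text and obtain a binary prediction for a text/class pair with an off-the-shelf pretrained encoder. It is a hedged sketch, not the authors' implementation: the encoder choice ("bert-base-uncased"), the class-identifier strings, and the score() helper are illustrative assumptions.

```python
# Hypothetical sketch of the described setup: one binary classifier that sees a
# class identifier together with the review text and predicts whether the text
# belongs to that class. Not the authors' released code.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)
model.eval()

def score(class_id: str, text: str) -> float:
    """Return an estimated probability that `text` belongs to `class_id`."""
    # Encode the class identifier and the review as a sentence pair, so the class
    # label is effectively prefixed to the input the model sees.
    inputs = tokenizer(class_id, text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits  # shape: (1, 2)
    return torch.softmax(logits, dim=-1)[0, 1].item()

# Example text/class pair; the score is uninformative until the classifier has
# been fine-tuned jointly on all training classes.
print(score("books_positive", "A gripping read from start to finish."))
```

Because every class shares the same binary classifier, training examples from all classes can be pooled into one fine-tuning run, which is what makes the transfer-learning formulation in the abstract straightforward to apply.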
