Computer Speech and Language

Transfer fine-tuning of BERT with phrasal paraphrases



Abstract

Sentence pair modelling is the task of identifying the semantic interaction between a pair of sentences, i.e., paraphrase identification, textual entailment recognition, and semantic similarity measurement. These constitute a set of crucial tasks for research in natural language understanding. Sentence representation learning is a fundamental technology for sentence pair modelling, in which the development of the BERT model realised a breakthrough. We have recently proposed transfer fine-tuning using phrasal paraphrases to make BERT's representations suitable for assessing semantic equivalence between sentences while maintaining the model size. Herein, we show that transfer fine-tuning with simplified feature generation produces representations that are broadly effective across different types of sentence pair modelling tasks. Detailed analysis confirms that our transfer fine-tuning helps the BERT model converge more quickly with a smaller fine-tuning corpus.
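The sentence pair modelling setup described above typically packs both sentences into a single BERT input sequence. A minimal sketch of this packing, using a whitespace tokeniser as a stand-in for BERT's WordPiece (the function name and tokenisation are illustrative assumptions, not the authors' implementation):

```python
# Sketch of BERT-style sentence-pair input packing, as used in sentence
# pair modelling tasks (paraphrase, entailment, similarity).
# Whitespace splitting stands in for BERT's WordPiece tokeniser.

def pack_pair(sent_a, sent_b):
    tokens_a = sent_a.lower().split()
    tokens_b = sent_b.lower().split()
    # BERT concatenates both sentences with special tokens:
    # [CLS] sentence A [SEP] sentence B [SEP]
    tokens = ["[CLS]"] + tokens_a + ["[SEP]"] + tokens_b + ["[SEP]"]
    # Segment (token type) ids: 0 over the sentence-A span (incl. [CLS]
    # and the first [SEP]), 1 over the sentence-B span (incl. final [SEP]).
    segment_ids = [0] * (len(tokens_a) + 2) + [1] * (len(tokens_b) + 1)
    return tokens, segment_ids

tokens, segs = pack_pair("He bought a car", "He purchased an automobile")
print(tokens[0], tokens[-1])      # [CLS] [SEP]
print(len(tokens) == len(segs))   # True
```

The representation of the `[CLS]` position (or a pooling over all positions) is what a classifier head consumes when fine-tuning on a sentence pair task.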
