Speeding up Context-based Sentence Representation Learning with Non-autoregressive Convolutional Decoding

3rd Workshop on Representation Learning for NLP, 2018


Abstract

Context plays an important role in human language understanding, thus it may also be useful for machines learning vector representations of language. In this paper, we explore an asymmetric encoder-decoder structure for unsupervised context-based sentence representation learning. We carefully designed experiments to show that neither an autoregressive decoder nor an RNN decoder is required. After that, we designed a model which still keeps an RNN as the encoder, while using a non-autoregressive convolutional decoder. We further combine a suite of effective designs to significantly improve model efficiency while also achieving better performance. Our model is trained on two different large unlabelled corpora, and in both cases the transferability is evaluated on a set of downstream NLP tasks. We empirically show that our model is simple and fast while producing rich sentence representations that excel in downstream tasks.
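
The asymmetric architecture described in the abstract can be pictured with a short sketch. The PyTorch code below is a minimal illustration under stated assumptions, not the paper's exact configuration: it assumes a bidirectional GRU encoder whose final hidden states form the sentence vector, learned position embeddings (an addition of this sketch, so per-position predictions differ), and a small stack of 1-D convolutions that predicts every word of a context sentence in one parallel pass, with no autoregressive feedback. The class name and all dimensions are illustrative.

import torch
import torch.nn as nn
import torch.nn.functional as F


class AsymmetricSentenceModel(nn.Module):
    """Sketch: RNN encoder + non-autoregressive convolutional decoder."""

    def __init__(self, vocab_size=20000, emb_dim=300, hid_dim=600, dec_len=30):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Encoder: bidirectional GRU; the sentence vector is the
        # concatenation of the two directions' final hidden states.
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True,
                              bidirectional=True)
        # Learned position embeddings (an assumption of this sketch) so the
        # decoder can emit a different word distribution at each position.
        self.pos = nn.Parameter(torch.randn(2 * hid_dim, dec_len) * 0.02)
        # Decoder: 1-D convolutions applied once over all target positions
        # in parallel -- no step-by-step autoregressive loop.
        self.decoder = nn.Sequential(
            nn.Conv1d(2 * hid_dim, hid_dim, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(hid_dim, hid_dim, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(hid_dim, vocab_size, kernel_size=1),
        )

    def forward(self, sent_ids):
        # sent_ids: (batch, src_len) token ids of the input sentence.
        _, h = self.encoder(self.embed(sent_ids))    # h: (2, batch, hid_dim)
        z = torch.cat([h[0], h[1]], dim=-1)          # (batch, 2 * hid_dim)
        # Tile the sentence vector across every decoder position, add the
        # position embeddings, and decode all words in a single pass.
        x = z.unsqueeze(-1) + self.pos               # (batch, 2*hid, dec_len)
        return self.decoder(x)                       # (batch, vocab, dec_len)


# Toy usage: predict the words of a context sentence from the encoded input.
model = AsymmetricSentenceModel()
src = torch.randint(0, 20000, (4, 25))    # batch of input sentences
tgt = torch.randint(0, 20000, (4, 30))    # word ids of the context sentence
loss = F.cross_entropy(model(src), tgt)   # per-position word prediction loss
loss.backward()

Because the convolutional decoder emits all positions at once, decoding cost no longer grows with a step-by-step recurrence over the target sentence, which is the source of the speed-up the abstract claims.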
