Conference: China National Conference on Computational Linguistics

WAE_RN: Integrating Wasserstein Autoencoder and Relational Network for Text Sequence



Abstract

One challenge in the Natural Language Processing (NLP) field is to learn semantic representations across different contexts. Recent work on pre-trained language models has received great attention and has proven to be an effective technique. Despite the success of pre-trained language models on many NLP tasks, the learned text representation only captures the correlations among the words within the sentence itself and ignores the implicit relationships between arbitrary tokens in the sequence. To address this problem, we focus on making our model effectively learn word representations that contain the relational information between any tokens of a text sequence. In this paper, we propose to integrate a relational network (RN) into a Wasserstein autoencoder (WAE). Specifically, the WAE and the RN are used to better preserve the semantic structure and to capture the relational information, respectively. Extensive experiments demonstrate that our proposed model achieves significant improvements over traditional Seq2Seq baselines.
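The abstract gives no implementation details, but the RN component it names follows a well-known pattern: apply a relation function g to every ordered pair of token representations, sum the results, and pass the pooled vector through a readout function f. The sketch below is a minimal, hedged illustration of that pattern only — the dimensions, the two-layer MLPs, and the untrained random weights are all hypothetical stand-ins, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, W1, b1, W2, b2):
    # Two-layer perceptron with ReLU; used here for both g and f.
    h = np.maximum(x @ W1 + b1, 0.0)
    return h @ W2 + b2

def relational_network(tokens, d_hidden=16, d_out=8):
    """Relation Network over a sequence of token embeddings.

    tokens: (n, d) array of token representations.
    Returns a (d_out,) vector summarizing pairwise relations:
        RN(O) = f( sum over i,j of g(o_i, o_j) )
    Weights are random placeholders; in a real model they are learned.
    """
    n, d = tokens.shape
    # Hypothetical parameters of g (acts on concatenated pairs) and f.
    Wg1 = rng.normal(size=(2 * d, d_hidden)); bg1 = np.zeros(d_hidden)
    Wg2 = rng.normal(size=(d_hidden, d_hidden)); bg2 = np.zeros(d_hidden)
    Wf1 = rng.normal(size=(d_hidden, d_hidden)); bf1 = np.zeros(d_hidden)
    Wf2 = rng.normal(size=(d_hidden, d_out)); bf2 = np.zeros(d_out)

    # Build all ordered token pairs (o_i, o_j), including i == j.
    pairs = np.concatenate(
        [np.repeat(tokens, n, axis=0), np.tile(tokens, (n, 1))], axis=1
    )                                            # shape (n*n, 2d)
    relations = mlp(pairs, Wg1, bg1, Wg2, bg2)   # g over every pair
    pooled = relations.sum(axis=0)               # order-invariant aggregation
    return mlp(pooled[None, :], Wf1, bf1, Wf2, bf2)[0]  # readout f

# Toy usage: five 12-dimensional "token embeddings".
emb = rng.normal(size=(5, 12))
rel_vec = relational_network(emb)
print(rel_vec.shape)
```

Because the pairwise sum is permutation-invariant, the resulting vector encodes relations between arbitrary token pairs rather than only adjacent ones, which is the property the abstract attributes to the RN.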
