International Joint Conference on Neural Networks

Hierarchical Tree Long Short-Term Memory for Sentence Representations



Abstract

A fixed-length feature vector is required by many machine learning algorithms in NLP. Word embeddings have been very successful at capturing lexical information, but they cannot capture the compositional meaning of sentences, which prevents a deeper understanding of language. In this paper, we introduce a novel hierarchical tree long short-term memory (HTLSTM) model that learns vector representations for sentences of arbitrary syntactic type and length. We propose to split a sentence into three hierarchies: the short-phrase, long-phrase, and full-sentence levels. The HTLSTM model gives our algorithm the potential to fully exploit the hierarchical information and long-term dependencies of language. We evaluate the model on sentiment analysis tasks over both English and Chinese corpora, and the results show that it significantly outperforms several existing state-of-the-art approaches.
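The abstract does not give the HTLSTM gating equations, but the composition unit it describes is tree-structured. As an illustration only, the sketch below implements the standard child-sum Tree-LSTM node update (Tai et al., 2015), which a hierarchical model of this kind could apply at the short-phrase, long-phrase, and sentence levels; the class name, dimensions, and initialization are assumptions, not the paper's method.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class ChildSumTreeLSTMCell:
    """One child-sum Tree-LSTM node update (illustrative sketch).

    Each node combines its input vector x with the summed hidden
    states of its children; a separate forget gate is computed for
    every child so the cell can selectively keep or drop each
    sub-phrase's memory.
    """

    def __init__(self, input_dim, hidden_dim, rng=None):
        rng = rng or np.random.default_rng(0)
        d, h = input_dim, hidden_dim
        # one weight set per gate: input i, output o, update u, forget f
        self.W = {g: rng.standard_normal((h, d)) * 0.1 for g in "iouf"}
        self.U = {g: rng.standard_normal((h, h)) * 0.1 for g in "iouf"}
        self.b = {g: np.zeros(h) for g in "iouf"}

    def __call__(self, x, child_h, child_c):
        """x: (d,) input vector; child_h, child_c: lists of (h,) arrays."""
        h_sum = (np.sum(child_h, axis=0) if child_h
                 else np.zeros_like(self.b["i"]))
        i = sigmoid(self.W["i"] @ x + self.U["i"] @ h_sum + self.b["i"])
        o = sigmoid(self.W["o"] @ x + self.U["o"] @ h_sum + self.b["o"])
        u = np.tanh(self.W["u"] @ x + self.U["u"] @ h_sum + self.b["u"])
        # one forget gate per child, conditioned on that child's state
        f = [sigmoid(self.W["f"] @ x + self.U["f"] @ hk + self.b["f"])
             for hk in child_h]
        c = i * u + sum(fk * ck for fk, ck in zip(f, child_c))
        h = o * np.tanh(c)
        return h, c
```

A leaf node is updated with an empty child list; an internal node (e.g. a long phrase built from two short phrases) passes its children's `(h, c)` pairs. The final sentence-level `h` serves as the fixed-length sentence representation fed to a classifier.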
