Chinese Lexical Semantics Workshop

Learning Term Weight with Long Short-Term Memory for Question Retrieval



Abstract

Most previous methods for question retrieval treat all words as equally important. This paper employs a bidirectional long short-term memory (LSTM) network to predict each word's salience weight in the question, guided by the word's matching status in the answer. Our method is trained on a large corpus of natural question-answer pairs, so it requires no human annotation. We conduct question-retrieval experiments on a community question answering (cQA) dataset. The results show that our model outperforms traditional methods by a wide margin.
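To make the abstract's idea concrete, here is a minimal sketch in PyTorch, not the authors' code: a bidirectional LSTM reads the question and emits a scalar salience weight per term, trained against whether the term also appears in the answer. All names, layer sizes, and the sigmoid/BCE formulation are assumptions for illustration.

```python
import torch
import torch.nn as nn

class TermWeightBiLSTM(nn.Module):
    """Sketch: predict a salience weight for each question term with a BiLSTM."""
    def __init__(self, vocab_size, embed_dim=100, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        # Map each token's forward+backward hidden state to a scalar weight.
        self.score = nn.Linear(2 * hidden_dim, 1)

    def forward(self, question_ids):
        # question_ids: (batch, seq_len) tensor of token indices
        emb = self.embed(question_ids)        # (batch, seq_len, embed_dim)
        hidden, _ = self.bilstm(emb)          # (batch, seq_len, 2*hidden_dim)
        weights = torch.sigmoid(self.score(hidden)).squeeze(-1)
        return weights                        # (batch, seq_len) term weights

# Training signal per the abstract: whether each question term matches the
# answer, so no manual annotation is needed. Hypothetical toy example:
model = TermWeightBiLSTM(vocab_size=10000)
question = torch.randint(0, 10000, (2, 8))    # two questions, 8 tokens each
target = (torch.rand(2, 8) > 0.5).float()     # stand-in answer-match indicators
loss = nn.functional.binary_cross_entropy(model(question), target)
loss.backward()
```

At retrieval time, the predicted weights could replace uniform or IDF-style term weights inside a standard matching function; the abstract does not specify that scoring function, so this sketch stops at the weighting step.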

