International Joint Conference on Artificial Intelligence

Knowledge-enhanced Hierarchical Attention for Community Question Answering with Multi-task and Adaptive Learning



Abstract

In this paper, we propose Knowledge-enhanced Hierarchical Attention for community question answering with Multi-task and Adaptive learning (KHAMA). First, we propose a hierarchical attention network that fully fuses knowledge from the input documents and a knowledge base (KB) by exploiting the semantic compositionality of the input sequences. The external factual knowledge helps recognize background knowledge (entity mentions and their relationships) and eliminates noisy information from long documents with sophisticated syntactic and semantic structures. In addition, we build multiple CQA models with adaptive boosting and combine them to obtain a more effective and robust CQA system. Furthermore, KHAMA is a multi-task learning model: it treats CQA as the primary task and question categorization as an auxiliary task, aiming to learn a category-aware document encoder and to improve the quality of identifying essential information in long questions. Extensive experiments on two benchmarks demonstrate that KHAMA achieves substantial improvements over the compared methods.


