International Conference on Computational Linguistics

Autoregressive Reasoning over Chains of Facts with Transformers



Abstract

This paper proposes an iterative inference algorithm for multi-hop explanation regeneration, which retrieves relevant factual evidence in the form of text snippets, given a natural language question and its answer. Combining multiple sources of evidence or facts for multi-hop reasoning becomes increasingly hard as the number of sources needed to make an inference grows. Our algorithm copes with this by decomposing the selection of facts from a corpus autoregressively, conditioning the next iteration on previously selected facts. This allows us to use a pairwise learning-to-rank loss. We validate our method on datasets of the TextGraphs 2019 and 2020 Shared Tasks for explanation regeneration. Existing work on this task either evaluates facts in isolation or artificially limits the possible chains of facts, thus limiting multi-hop inference. We demonstrate that our algorithm, when used with a pre-trained transformer model, outperforms the previous state-of-the-art in terms of precision, training time and inference efficiency.
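The autoregressive selection loop described in the abstract can be illustrated with a minimal sketch. All names here are hypothetical, and a simple word-overlap scorer stands in for the paper's pre-trained transformer ranker; the key idea shown is conditioning each retrieval step on the facts already selected.

```python
# Hypothetical sketch of autoregressive fact selection, assuming a
# word-overlap score in place of the transformer-based ranker used in
# the paper. Names (score, select_facts) are illustrative only.

def score(query_text, fact):
    """Stand-in relevance score: fraction of fact words also in the query."""
    q = set(query_text.lower().split())
    f = set(fact.lower().split())
    return len(q & f) / (len(f) or 1)

def select_facts(question, answer, corpus, hops=3):
    """Greedily build an explanation chain one fact per iteration,
    conditioning each step on the facts selected so far."""
    selected = []
    remaining = list(corpus)
    for _ in range(min(hops, len(remaining))):
        # Condition the query on previously selected facts (the
        # autoregressive step described in the abstract).
        query = " ".join([question, answer] + selected)
        best = max(remaining, key=lambda fact: score(query, fact))
        selected.append(best)
        remaining.remove(best)
    return selected

corpus = [
    "friction produces heat",
    "rubbing two objects together causes friction",
    "heat can start a fire",
    "the moon orbits the earth",
]
chain = select_facts(
    "Why can rubbing sticks together start a fire?", "friction", corpus
)
```

Because each iteration re-scores the corpus against an expanded query, facts that only become relevant in the context of earlier selections can still be retrieved, which is what makes the decomposition suitable for multi-hop chains.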
